This is in reference to Stack Overflow Podcast #65. Assume a typical '60s or '70s server computer with, say, 256K of main memory. How large a (compiled) COBOL program could such a machine run, at maximum? How severely would this limit the complexity and capabilities of COBOL programs, assuming the programs are not deliberately made more complex than necessary?
IBM mainframe operating systems already supported virtual storage back then, although you could buy a condo on the beach today for what the yearly IBM lease cost! I don't remember any insurmountable program-size issues.
One thing to consider is that back then almost everything ran in batch mode, which limited how complex any one program needed to be. One program would pre-process the data and store it on disk. The next might sort it and add some calculated result. The next might update a database. The last one in the batch might print out a report. So complexity (and size) was broken up over several programs running in sequence.
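For a sense of scale, one of those batch steps was often just a straight read-process-write loop over sequential files. The sketch below is purely illustrative (the program name, DD names, and record layouts are made up), written in the fixed-format, period-terminated style typical of COBOL-74 era programs:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. BATCHSTP.
      * Hypothetical batch step: read a transaction file, add a
      * calculated field, and write the result to disk for the
      * next program in the job stream to pick up.
       ENVIRONMENT DIVISION.
       INPUT-OUTPUT SECTION.
       FILE-CONTROL.
           SELECT TRANS-IN  ASSIGN TO TRANSIN.
           SELECT TRANS-OUT ASSIGN TO TRANSOUT.
       DATA DIVISION.
       FILE SECTION.
       FD  TRANS-IN
           LABEL RECORDS ARE STANDARD.
       01  IN-REC.
           05  IN-QTY         PIC 9(5).
           05  IN-UNIT-PRICE  PIC 9(5)V99.
           05  FILLER         PIC X(68).
       FD  TRANS-OUT
           LABEL RECORDS ARE STANDARD.
       01  OUT-REC.
           05  OUT-QTY        PIC 9(5).
           05  OUT-UNIT-PRICE PIC 9(5)V99.
           05  OUT-EXTENDED   PIC 9(10)V99.
           05  FILLER         PIC X(56).
       WORKING-STORAGE SECTION.
       01  WS-EOF             PIC X VALUE 'N'.
       PROCEDURE DIVISION.
       MAIN-PARA.
           OPEN INPUT TRANS-IN OUTPUT TRANS-OUT.
           READ TRANS-IN AT END MOVE 'Y' TO WS-EOF.
           PERFORM PROCESS-REC UNTIL WS-EOF = 'Y'.
           CLOSE TRANS-IN TRANS-OUT.
           STOP RUN.
       PROCESS-REC.
           MOVE SPACES        TO OUT-REC.
           MOVE IN-QTY        TO OUT-QTY.
           MOVE IN-UNIT-PRICE TO OUT-UNIT-PRICE.
           MULTIPLY IN-QTY BY IN-UNIT-PRICE GIVING OUT-EXTENDED.
           WRITE OUT-REC.
           READ TRANS-IN AT END MOVE 'Y' TO WS-EOF.

In a real shop the JCL for the job would wire the output dataset of one step to the input of the next, so each program could stay small and single-purpose.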
Fairly large COBOL programs could run in 256K on a '70s mainframe. (On an IBM System/370, 256K meant 256K bytes; the machine was byte-addressable.) IBM introduced virtual storage on the System/370 in the early 1970s. It paged the program and its data to and from disk, allowing a program to use most of the 24-bit (16 MB) address space, with some limitations. Just like Windows!
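As a rough, purely hypothetical illustration of what virtual storage bought you: the WORKING-STORAGE table below occupies about 800K bytes (100,000 entries of 8-byte packed decimal), roughly three times the 256K of real memory in the question, yet it still fits easily inside a 16 MB (24-bit) address space and is simply paged in and out as it is touched. Table-size limits varied by compiler, so the numbers are only for illustration:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. BIGTABLE.
      * Illustration only: this table occupies about 800K bytes of
      * working storage (100,000 entries x 8 bytes packed decimal),
      * more than the 256K of real memory but well within a 24-bit
      * (16 MB) virtual address space. The operating system pages
      * the storage to and from disk as the entries are referenced.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-BALANCE-TABLE.
           05  WS-BALANCE    PIC S9(13)V99 COMP-3
                             OCCURS 100000 TIMES.
       01  WS-IX             PIC 9(8) COMP.
       PROCEDURE DIVISION.
       MAIN-PARA.
           PERFORM CLEAR-ENTRY
               VARYING WS-IX FROM 1 BY 1 UNTIL WS-IX > 100000.
           STOP RUN.
       CLEAR-ENTRY.
           MOVE ZERO TO WS-BALANCE (WS-IX).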