On February 15, 1946, the Moore School of Electrical Engineering at the University of Pennsylvania in Philadelphia unveiled the Electronic Numerical Integrator And Computer (ENIAC). The machine, developed between 1943 and 1946 as part of the American war effort, is widely recognized as the world’s first general-purpose electronic digital computer.
Computer company Unisys traces its lineage to UNIVAC, a commercial machine built by ENIAC inventors J. Presper Eckert and John Mauchly.
Unisys ClearPath Forward CTO Jim Thompson is an ENIAC history buff, and ahead of the 75th anniversary, he told Computer Weekly why people are fascinated by old hardware. “People are way more into computing history than I am,” Thompson said. “It’s generational. My parents grew up without a computer. My generation has seen many changes. The story has appeal and the roots are always important.”
Thompson says his children, who are in their thirties, have always had computers in their lives, and when they were young, “they wrote programs – it’s like TV for my generation”.
For Thompson, the difference between the generation raised with computers and the previous one, for whom television was the primary form of entertainment, is that computing has become stealthy and pervasive in people’s lives. “People think TV is something you do,” he said.
According to Wikipedia, ENIAC cost $487,000 in 1946, which is about $6.53 million in today’s money. But throughout the history of computing, many have questioned the economic value of computing technology.
Since the late 1960s, Moore’s Law has been the formula the industry has used to drive progress: every 18 months to two years, the industry is able to sell devices with twice the computing power at the same price. This is measured by the number of transistors that can be put on a chip, and more transistors usually means more computing power.
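The doubling described above can be sketched in a few lines of Python. The starting transistor count and two-year doubling period below are illustrative assumptions, not figures from the article:

```python
# A minimal sketch of Moore's Law as described above: transistor counts
# doubling every two years. The starting count (roughly that of an early
# 1970s microprocessor) and the period are illustrative assumptions.
def transistors_after(years, start=2_300, period=2):
    """Project a transistor count forward, doubling every `period` years."""
    return start * 2 ** (years // period)

# Five doublings over a decade gives a 32x increase.
print(transistors_after(10))  # 2_300 * 2**5 = 73_600
```

The exponential shape, rather than the exact numbers, is the point: a modest per-generation doubling compounds into orders-of-magnitude growth over a few decades.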
But Thompson has a theory about it: “Moore’s Law has been exaggerated or misinterpreted. It’s really about transistor technology and the relentless pursuit of more transistors on integrated circuits. But there is a corollary: as we make things smaller, the costs increase more and more.”
Last year, a Computer Weekly reader came across the first computer he worked on, an IBM 360 system from the early 1970s. At the time, it could be rented for around £8,000 a year. In today’s money, that works out to £87,339 a year.
A company needing a mainframe in 1973 would probably have been quite large. Today, that figure of around £87,339 a year is what relatively small businesses need to spend on IT.
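The inflation adjustment behind those figures is a simple multiplier, derived here from the article’s own numbers rather than from an official price index:

```python
# Back out the 1973-to-today inflation multiplier implied by the
# article's figures. These are the article's numbers, not index data.
RENT_1973 = 8_000    # annual IBM 360 rental, GBP, from the article
RENT_TODAY = 87_339  # the article's inflation-adjusted equivalent

multiplier = RENT_TODAY / RENT_1973
print(f"implied 1973 -> today multiplier: {multiplier:.2f}")  # about 10.92
```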
Although, in real terms, the cost of technology has increased, Thompson believes investing in technology still creates value. “The work that institutions are doing on enterprise hardware today is not the same as it was 50 years ago,” he said. “We put banking on a phone, for example.”
This requires software services that wrap around the core banking system to provide a mobile-friendly user interface, he added. “It takes a lot of computing power to wrap around a basic banking application.”
Loss of programming skills
Although such advances give everyone access to sophisticated computer systems, Thompson believes programming skills are being eroded. “We don’t teach people how to do things anymore,” he said.
One example is the trend of using low-code/no-code tools to reduce the skills needed to build useful business applications. But, Thompson says, “The tasks are automated, which they shouldn’t be.” In other words, automation does not necessarily help develop programming skills.
Unlike modern computer architectures, ENIAC did not store its program in memory; instead, it comprised a series of modules for performing calculations. Recalling his programming experiences, Thompson says, “I remember working on mainframes with 6MB of memory. We had to code tightly to get the most out of the machine.
“Programmers today code less than before. Memory is unlimited, I/O [input/output bandwidth] is not a problem. In a traditional sense, we no longer do data processing. Today’s programmers create applications.”
The fact that computing resources are now considered unlimited and, thanks to Moore’s Law, get faster with each new generation has led some programmers to take the technology for granted, Thompson said.
He worries that programmers will lose the skills needed to program for resource-constrained hardware and the ability to understand low-level coding, such as operating system architecture.
“We’ve been through the operating systems bubble,” he said. “But now there are only a few mainframe, Unix, Linux and Windows operating systems. Programmers are not interested in building an operating system.”
In the past, low-level coding techniques and an understanding of the operating system were usually prerequisites for making a piece of code “work”. But today, Thompson said, it all depends on the app being built, and any required functionality can usually be downloaded from the cloud.