At the start of my career I worked with fairly dumb terminals and mainframes. Resources were limited and shared with the other users of the system, so we favored efficient languages that typically compiled to native binaries.

As personal computers made their appearance, developers and users had much more computing power at their disposal, and that power was reflected in the servers behind the early internet. The increase in both memory and CPU meant that less efficient but more productive runtimes could be supported; Java, .NET, and Ruby all benefited from this additional power.

Software development is a continuously evolving landscape of capabilities, and I think we are at a new juncture.

With lightweight containers and the pace of change among cloud providers, the drivers for efficiency are becoming more and more important.

Cloud vendors and their customers are realizing that compute density, efficiency, and startup speed matter more and more. AWS Lambda's support for Rust and C++ is an interesting move. With first-class support for efficient, natively compiled applications, the vendor wins by reducing the waste of spinning up new instances, and customers gain cheaper runtimes that may balance out the higher cost of development.

Developer Workflow

It has always been important for developers to run and evaluate their work quickly. As capabilities move from application and service code into infrastructure, it becomes more difficult to run a complete stack locally, and as cloud providers develop ever more compelling capabilities, running a complete application on a development machine becomes harder still. Maintaining a local environment with the complexity of a cloud is costly, complex to maintain, and unlikely to be completely 'production like'.

As an alternative to 'runs on my laptop' there is a growing trend towards 'runs on my environment' - a cloud-based extension of the developer workstation. In this model code is written and built locally, then deployed to an environment for evaluation. It's unsurprising that the lines between local IDE development and cloud runtimes are blurring, with built-in support for remote deployment and debugging.


Software developers now face a dilemma: continue to develop test frameworks and stubs that emulate the runtime environment of their applications, or embrace the rapidly evolving environments the cloud vendors provide for development. As command-line tools and IDE integrations reduce the impediments, using cloud services as an extension of the developer workstation becomes ever more tempting. Test rigs that run on the workstation only become available later, and as the capabilities grow, trying to mimic an entire cloud provider's services becomes ever more expensive.

It has been a long time since I last used a mainframe as my development environment. How well does the modern cloud environment match up to developer expectations for local development and test?