2023 Review: A Year of Innovation and Disruption in Software Development

By Ben Hall, Vice President Technology | Published: January 3, 2024

It’s not all generative AI 

Software development is always changing, and 2023 was no exception. ChatGPT and generative AI absorbed most of the attention, but there were other interesting developments:

  • We keep seeing interest from our clients, particularly large enterprises, in low-code/no-code solutions, and we see the potential in them. However, rapid and valuable delivery can be derailed when the requirements fall outside the palette of building blocks the tool provides. I suggest that good design either accepts those limitations and fits solutions within those constraints, or recognizes early that the tool isn’t a good fit and moves to custom software.
  • On the server side, despite our continued support for Moleculer, our go-to JavaScript framework this year has been NestJS, and TypeScript has dominated plain JavaScript in our recent projects.
  • Intelliware has been using OpenTelemetry on both internal and external projects. For Spring Boot-based projects, getting started with Azure Monitor Application Insights can be as simple as adding a JVM argument, or only slightly less simple: adding a JAR to the pom.xml and a single line of code (a sketch of that option follows this list).
  • Since Intelliware does a lot of Java, I expect we’ll soon see opportunities to use Java 21, the latest Long Term Support (LTS) version and the successor to 17 (a small example of what it brings also follows this list). The reality is that in 2023, Java 11 (released in 2018) was the version most used in production, followed by Java 8 (released in 2014, almost 10 years ago). The Java community has been extremely slow to adopt new releases, but that trend seems to be changing: Java 17 went from 1% adoption in 2022 to 9% in 2023, which isn’t a lot, but much faster than Java 11’s adoption rate.
  • We’ve had our first project this year using WebAssembly. Our team took a C library that was previously used only in native Android and iOS apps and compiled it to Wasm with Emscripten so it could run directly in the browser. This is more fuel for the fire of Chris Ford’s campaign: “You think you need a native app but you don’t.”
  • Our experience with Micro Frontends for web user interfaces shows a variety of ways to implement this architectural pattern. Regardless of the technical solution, we’ve learned that the goal of this pattern is to keep separate teams decoupled from each other, enabling them to deliver features without blocking each other. If Conway’s Law isn’t motivating a Micro Frontend strategy, it might be the wrong approach.
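
For the “slightly less simple” Application Insights option mentioned in the OpenTelemetry bullet, here is a minimal sketch of what the single line of code can look like, assuming the applicationinsights-runtime-attach artifact has been added to the pom.xml and a connection string is supplied via the APPLICATIONINSIGHTS_CONNECTION_STRING environment variable:

```java
import com.microsoft.applicationinsights.attach.ApplicationInsights;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class DemoApplication {

    public static void main(String[] args) {
        // The "single line of code": attaches the Application Insights Java agent
        // at startup, which instruments the app with OpenTelemetry-based
        // auto-instrumentation and exports telemetry to Azure Monitor.
        ApplicationInsights.attach();
        SpringApplication.run(DemoApplication.class, args);
    }
}
```

The JVM-argument route mentioned above is the same agent, attached with a -javaagent flag at launch instead of from code.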
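
As for what the new LTS brings beyond longer support, record patterns and pattern matching for switch were both finalized in Java 21. A small illustration (the types and method here are invented purely for the example):

```java
// Record patterns and pattern matching for switch, both finalized in Java 21.
sealed interface Shape permits Circle, Rectangle {}
record Circle(double radius) implements Shape {}
record Rectangle(double width, double height) implements Shape {}

class Areas {
    static double area(Shape shape) {
        return switch (shape) {
            // Record patterns deconstruct the components in place; the switch is
            // exhaustive because Shape is sealed, so no default branch is needed.
            case Circle(double r) -> Math.PI * r * r;
            case Rectangle(double w, double h) -> w * h;
        };
    }
}
```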

But there was a lot of generative AI 

The explosion of generative AI and related tools has been amazing. Two main interfaces have interested me for supporting software developers: chat and IDE plugins. For chat, I personally prefer Bing Chat over ChatGPT because it includes current search results and footnotes its responses. And since Intelliware is on the Microsoft 365 ecosystem, we have easy access to Bing Chat Enterprise. As for IDE use, it’s still early days, but our expectations of GitHub Copilot are high.

Our biggest concern using these tools has been protecting the privacy and intellectual property of our clients while, at the same time, not violating the IP of others whose material has been included in the training corpus. At the time of writing, the Canadian banks have specifically disallowed the use of such tools while they undertake their due diligence, research, and investigation. However, just this week one of our enterprise clients has cautiously started rolling out Amazon’s CodeWhisperer.

Your AI Pair Programmer—”You keep saying that word” 

GitHub Copilot’s tagline is “Your AI pair programmer.” For those of us with long experience in Extreme Programming, that sounds misleading. Pair programming can help fill knowledge gaps and get more code written, but it’s about a lot more: pairing supports communication within the team, spreads a consistent view across the team, and builds the team’s overall capability. As Birgitta Boeckeler said in a recent podcast, “it’s about having the context of what’s going on, knowing this person wasn’t at story kickoff.” I’m not convinced that a tool embedded in the IDE, providing code completions, is going to fulfill that role, at least not yet.

Opacity vs. One Level of Abstraction Down 

When using tools and frameworks, the general advice is to “understand one abstraction below the abstraction you’re working in.” For example, when using Spring you should understand proxies and dependency injection, even though the framework saves you from the heavy lifting of doing that work yourself. An objection to generative AI tools is that their opacity is too high: we can’t see how the model arrives at its suggestions. However, at least the way tools like Copilot work today, we use and manipulate the output directly and therefore have less need to understand how the tool produced it. To be explicit: it would be unprofessional to take the generated code and use it without understanding the details of that code, but it isn’t necessary to understand how Copilot and the underlying LLM mapped your prompt to that output. Developers should treat the suggestions with open-minded skepticism. As with Stack Overflow, the code you find is probably helpful, but you must exercise your judgement and be careful with the answers.

For a while I’ve thought about the progression of technology in software development in terms I first saw in C. J. Date’s book “What Not How”. He makes the point that the progression through generations of languages, from assembler to 3GLs and 4GLs, is a progression from procedural to declarative. For example, prior to SQL, doing the equivalent of a join with indexed files meant opening a file, seeking to a particular record to get a field, and then using the data in that field while opening and seeking through records in another file. To join using SQL, we tell the database what we want and the database figures out how to do it. However, that progression from procedural to declarative required a precise and deterministic abstraction: we say exactly what we want, and the database returns exactly what we asked for. Whether we’re talking about code generators or no-code/low-code frameworks, this need for precise abstraction still applies. This was the framing I brought to my understanding of generative AI, and it leads to questions like, “Should we treat our prompts like source code?”
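
To make that contrast concrete, here is a minimal sketch of the declarative side using plain JDBC (the table and column names are invented for illustration); the pre-SQL procedural equivalent is only described in the trailing comment:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

class WhatNotHow {

    // Declarative: we say WHAT we want; the database decides HOW to do the join,
    // e.g. which indexes to use and which table to scan first.
    static void printOrderTotals(String jdbcUrl, long customerId) throws SQLException {
        String sql = """
                SELECT c.name, o.order_id, o.total
                FROM customers c
                JOIN orders o ON o.customer_id = c.customer_id
                WHERE c.customer_id = ?
                """;
        try (Connection conn = DriverManager.getConnection(jdbcUrl);
             PreparedStatement stmt = conn.prepareStatement(sql)) {
            stmt.setLong(1, customerId);
            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    System.out.printf("%s %d %s%n",
                            rs.getString("name"), rs.getLong("order_id"),
                            rs.getBigDecimal("total"));
                }
            }
        }
    }

    // The procedural, pre-SQL equivalent described above would open the customer
    // file, seek to the record for customerId, read the key field, then open the
    // orders file and seek record by record for matching entries: every step of
    // HOW is spelled out by the programmer.
}
```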

However, after listening to Birgitta Boeckeler on the Thoughtworks Technology podcast, I changed my mind. Prompts to ChatGPT do not require precision, are not abstract, and the model is not deterministic in its output. As Boeckeler says, this is a change of mindset and means that “we still have to be in control and figure out what to do with the suggestions.” 

Old and New Entering the Future

It’s an exciting time to be a software developer. We have new tools, like generative AI, that would have been difficult to imagine only a couple of years ago, and renewed life in old tools like Java, Spring, and text-based diagramming. The great opportunity is to combine these tools in innovative and disruptive ways as we move forward.
