10 Predictions about the Software Development Trends in the 2030s
by Md Kamaruzzaman, July 2020

Image by Pete Linforth from Pixabay

In 1982, Nobel Laureate physicist Richard Feynman published a paper, “Simulating Physics with Computers,” and Quantum Computing was born. Unlike classical computers, a quantum computer uses quantum mechanical phenomena like superposition and quantum entanglement for computing. In simple words, a transistor in a digital computer can be in only one state at a time: either zero or one.

But the basic unit of a Quantum Computer, the Qubit, can be in the state zero, the state one, or a linear combination of 0 and 1. As a result, the power of a Quantum computer increases exponentially with every additional Qubit. Quantum entanglement means that once entangled, two Qubits remain correlated even if one is on Earth and the other is in the Andromeda Galaxy. If we think of a digital computer as an average human being, then Quantum computers are like super-intelligent aliens who can play chess with millions of players simultaneously or solve millions of math problems simultaneously.
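
To make the superposition idea concrete, here is a minimal NumPy sketch of the textbook state-vector picture of a qubit (the variable names are mine; only the math is standard):

```python
import numpy as np

# A classical bit is either 0 or 1; a qubit's state is a vector of two
# complex amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1.
zero = np.array([1, 0], dtype=complex)       # the |0> state
one = np.array([0, 1], dtype=complex)        # the |1> state
superposition = (zero + one) / np.sqrt(2)    # an equal mix of |0> and |1>

# Measurement collapses the state: P(0) = |alpha|^2, P(1) = |beta|^2.
print(np.abs(superposition) ** 2)            # [0.5 0.5]

# An n-qubit register needs 2**n amplitudes to describe, which is why the
# state space grows exponentially with every additional qubit.
for n in (10, 20, 30):
    print(f"{n} qubits -> {2**n:,} amplitudes")
```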

In the last decade, many advances were made in Quantum Computing. In October 2019, Google claimed that it had achieved Quantum Supremacy with its 54-qubit Sycamore Quantum computer, although the claim was challenged by IBM, its competitor in Quantum Computing.

Last month, Honeywell claimed that it had created the most powerful Quantum Computer yet, with a Quantum Volume of 64.

Currently, Quantum computing faces several challenges: the QPU must be kept at a temperature close to absolute zero, and Quantum computers produce large error rates due to quantum decoherence. In the next decade, Quantum Computing will be the hottest research topic as large corporations and world superpowers vie for Quantum Supremacy.

In the 2030s, quantum computing will be significantly more advanced, and engineers will use it in real-world scenarios like quantum physics simulation, weather forecasting, drug development, financial modeling, artificial intelligence, traffic optimization, and interstellar travel. It will also act as a catalyst for the advancement of other fields like AI. With Quantum Entanglement, we may even see the end of the CAP theorem in distributed computing, if entangled particles can be harnessed to share information instantly.

In the early 2030s, Quantum Computing will start threatening classical cryptography and the fields that depend on it, such as financial transactions and Blockchain. So there will be massive changes and agitation in the industry as everyone tries to replace classical cryptography with Quantum Cryptography. By the late 2030s, Quantum Computing will finally break classical cryptography, which may cause a social uproar on the scale of WikiLeaks as many secure and sensitive communications are decrypted.

All in all, the 2030s will be the decade of Quantum Computing.

Image by Tumisu from Pixabay

In 2006, Amazon took a giant step in Cloud Computing by offering three AWS Cloud services: EC2, S3, and SQS. In the 14 years since, Cloud computing has become pervasive and widespread. Initially, it was startups that embraced Cloud computing. In recent years, government organizations, health care, mining, banks, insurers, and even the Pentagon have been moving to the Cloud. The COVID-19 crisis has shown that companies need to adopt the Cloud not only to scale up and scale out but also to scale down. Canalys has reported a 34% increase in Cloud spending in Q1 2020.

In the next ten years, Cloud computing will be omnipresent and ubiquitous in Software development. The current issues of Cloud computing (e.g., security) will also be resolved in this decade. Google’s attempt to unify the Cloud stack via the Cloud Native Computing Foundation will gather more steam, and many Cloud services will be standardized by 2030.

During the 2030s, Cloud Computing (Public/Private/Hybrid) will be the “normal” way of developing software. On-premises data centers, too, will run either the “standardized” Cloud stack or a vendor-specific Cloud stack.

Due to the physical requirements of Quantum Computers, we will use Quantum Computing and Quantum Artificial Intelligence only in the Cloud in the 2030s. By the late 2030s, Quantum Encryption will be mature and will provide a robust, unbreakable security mechanism for Cloud computing. Whether we like it or not, Cloud computing in the 2030s will be centralized, and only the Big Tech companies will dominate it, just as they do today.

Image by Gerd Altmann from Pixabay

Artificial Intelligence is one of the earliest disciplines in Computer Science, but it faced several setbacks during the AI winters. After the second AI winter (the 1990s), the first major groundbreaking event in AI happened in 2012, when Andrew Ng and the Google Brain project trained a neural network on 16,000 CPU cores with 10 million unlabeled pictures taken from YouTube videos, and it learned to detect a cat. Another groundbreaking event occurred in 2016, when Google’s AlphaGo AI defeated the Go world champion, a feat made remarkable by the fact that the number of possible board positions in Go exceeds the total number of atoms in the Universe.

Modern hardware (GPUs) and Cloud computing have also turbocharged AI algorithms, which led to many innovations in Machine Learning/Deep Learning in the last decade. Alexa/Siri, spam detection, fraud detection, autonomous driving, shopping recommendations, and music recommendations are only a few of the many AI applications we use every day.

In the next decade, too, significant innovations and breakthroughs will happen in AI, especially in Reinforcement Learning. AI will start eating the world in the 2030s. Contrary to popular belief, AI will aid humans instead of replacing them. We will ride in autonomous cars. Doctors will use AI for better treatment. Life Science companies will use AI for better drug development. Even as developers, we will use AI-driven Operating Systems with AI-driven applications.

In the 2030s, AI will be wholly explainable and interpretable, unlike today. By then, AI will not only be able to find a cat but also explain why it is a cat. Breakthroughs in Quantum Computing in the 2030s will significantly boost AI, as Neural Network models will train on the fly and the trained models will be usable instantly with the help of Quantum Computers. I expect we will see the AI Singularity, i.e., AI that continues to improve in a runaway fashion without human assistance.

AI and Quantum Computing will take human civilization to the next level (Industry 5.0) in the 2030s.

Image by Gerd Altmann from Pixabay

In May 2018, Sundar Pichai (CEO of Google) unveiled the AI-based Smart Compose feature for Gmail and Google Docs. Initially, I was skeptical about its usefulness. Only two years later, I find it amazing how the Smart Compose feature speeds up my typing. For software developers, too, there are AI-driven code completion plugins/extensions for major IDEs, such as Codota and Kite. AI-driven test generation tools and AI-driven end-to-end testing tools are also gaining popularity.

Although these AI-driven software development tools are not yet very powerful, they will be a hot topic in the coming days. Any improvement in these tools will yield a huge productivity gain in software development. With an exponential increase in the number of open source projects, AI-driven software development tools will have ever more training data and will only get better and better.

In the 2030s, software developers will get lots of support from various AI-driven software development tools: code completion, test generation, end-to-end testing, database modeling, CI/CD, and so on. Developers will just need to define method names and fields, and AI will generate the source code, including unit, integration, and acceptance tests. Developers will also be able to write or speak the functionality of a project or class in plain English, and AI will generate the source code along with CI/CD pipelines and integration/acceptance tests. With the aid of AI, developer experience and developer productivity will be much higher in the 2030s than today.
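
As a purely hypothetical illustration of that workflow, the Python sketch below shows a developer declaring nothing but a data shape and a method signature; the `ai_codegen` tool it mentions is an assumption, not an existing product:

```python
# Entirely hypothetical sketch of the envisioned workflow. The developer
# declares only field names and a method signature; an imagined AI service
# fills in the implementation and the tests. `ai_codegen` does not exist
# today; it stands in for the kind of tool the 2030s may bring.

from dataclasses import dataclass

@dataclass
class Invoice:                 # the developer defines just the fields...
    customer_id: str
    amount_cents: int

def apply_discount(invoice: Invoice, percent: float) -> Invoice:
    ...                        # ...and the method name; the body stays empty

# A future tool might then be asked to do the rest, e.g.:
# ai_codegen.implement(apply_discount, tests=["unit", "integration", "acceptance"])
```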

Image by Виктория Бородинова from Pixabay

Software developers are the scarcest resource at present. If you are an entrepreneur and have a great idea, you first have to hire a few developers to implement your first product or MVP. Even if you are a Project Manager or Product Manager in a software company, you still need a software development team to build your MVP. And you have to wait a substantial amount of time (several months to a year) for the first glimpse of your MVP.

In the last few years, a new movement, LCNC (Low-Code No-Code), has been gaining traction by trying to reduce the barrier to product development. There already exist many excellent LCNC applications that make it possible to develop a first product in a short time without any software engineers. Bubble, Huddle, and Webflow offer speedy web application development. Kodika offers iOS app development without any code. Parabola is a no-code data workflow platform, whereas Airtable is an LCNC database-spreadsheet hybrid. There are also LCNC platforms for AI/ML.

Current LCNC platforms still have a long way to go before they can produce highly flexible, industry-grade applications. If industry-grade applications are LEGO Mindstorms, then current LCNC applications are LEGO Duplo. In the next decade, LCNC platforms will evolve immensely. In the 2030s, there will be a plethora of mature LCNC platforms that can create industry-grade applications. Entrepreneurs and business executives will develop 80–90% of their consumer-application MVPs using LCNC. There will also be a few powerful AI-driven LCNC platforms that even software engineers will use to start a new app. So, if you have fresh ideas but no money or coding experience, the 2030s will be an excellent time for you.

Photo by Max Duzij on Unsplash

During the 2000s, we thought we did not need any new programming languages. There was C/C++ for systems programming, Java for business applications, PHP/JavaScript for web development, and Ruby/Python for scripting. But in the decade that followed, we saw many innovations and breakthroughs in the programming language landscape. The success of Rust proved that, with an innovative idea, a new language can challenge even mighty programming languages like C/C++. Similarly, Go showed that it is possible to create a simple yet powerful and successful programming language. Swift, TypeScript, Julia, Kotlin, Dart, Elixir, Crystal, and Elm are other popular and widely adopted programming languages developed in the last decade.

In the coming decade, we will see even more innovation in the programming language landscape. Programming languages that originated in the last decade will become even more popular, while many new programming languages will hit the scene. In the 2030s, programming language market share will be much more fragmented and evenly distributed than it is today. Rust will replace C/C++ as the numero uno systems programming language, whereas Julia will replace Python as the de facto language of AI. With AI-driven software development and innovative tools, polyglot programming will be the norm rather than the exception in the 2030s.

I expect dozens of new Cloud Native programming languages to go mainstream in the 2030s. There will also be a few new programming languages for Quantum Computing.

By the 2030s, WebAssembly will be the de facto bytecode format for the Web and smart devices, with support for a multithreading programming model. It will allow consumer applications (e.g., Web, smart devices) to be written in any language while taking full advantage of the underlying hardware (e.g., the GPU). As a result, powerful, near-metal languages like Rust will be used to develop gaming, 3-D, AR/VR, and other CPU-intensive apps targeting the Web and smart devices. Also, in the 2030s, the browser will be the Operating System, and almost 100% of consumer desktop applications will run in browsers.

Image by marijana1 from Pixabay

If we have learned one thing from the last decade of the software development industry, it is that “one size does not fit all.” In hardware development, however, it is still “one size fits all.” Today’s software applications are so varied that in many cases specialized hardware can give a significant advantage over generic hardware. There are several examples of successful domain-specific hardware from the last few years. There is specialized hardware for Bitcoin mining that can compute SHA-256 hashes far more efficiently. Google has developed a specialized chip, the TPU, optimized for running TensorFlow. Another very successful piece of specialized hardware is AWS Nitro, specialized hardware for containerization/virtualization that has helped Amazon significantly with its serverless and EC2 platforms.
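
A toy Python version of the Bitcoin mining loop shows why this workload rewards dedicated silicon; the header and difficulty below are simplified stand-ins, but the repeated double SHA-256 search is the real shape of the problem:

```python
import hashlib
import struct

# Toy version of the Bitcoin mining loop: hash a block header with SHA-256
# twice, over and over, until the digest falls below a difficulty target.
# Real headers are 80 bytes with a fixed layout; the prefix and target here
# are simplified stand-ins.

def double_sha256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

header_prefix = b"toy-block-header"
target = 2 ** 240                    # toy difficulty: ~1 in 65,536 hashes wins

nonce = 0
while int.from_bytes(double_sha256(header_prefix + struct.pack("<I", nonce)), "big") >= target:
    nonce += 1

print(f"found nonce {nonce} after {nonce + 1} hashes")
# An ASIC does nothing but this inner loop, orders of magnitude faster and
# more power-efficiently than a general-purpose CPU.
```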

In the next decade, we will see more and more specialized computing hardware. In the 2030s, there will be a wide range of specialized hardware: special hardware for databases, special hardware for AI, specialized hardware for data processing, and so on. Currently, hardware development resembles pre-2010 software development, with its long release cycles. In the 2030s, hardware development will incorporate many best practices from software development. It will use Agile methodology with cross-functional teams in which hardware engineers work together with domain-specific software engineers. As a consequence, hardware release cycles will be shorter, which in turn will produce more domain-specific hardware.

Image by Gerd Altmann from Pixabay

Currently, we face the two-database problem. According to the CAP theorem, a distributed system/distributed database can have only two of the three CAP properties (consistency, availability, partition tolerance). For applications where consistency is the essential requirement (e.g., banking, insurance, and most other business applications), we use SQL databases (OLTP), which offer the C and A of CAP. For highly scalable applications where availability is the crucial requirement over consistency (e.g., analytical workloads, social-media-like workloads), we use various NoSQL databases (OLAP). Moving data from OLTP to OLAP requires lots of work. Also, SQL databases offer a single abstraction layer for the data (SQL), whereas NoSQL does not provide any single abstraction layer.
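
A toy Python sketch of that trade-off, with illustrative classes rather than a real database: when a partition cuts a replica off from its peer, every write forces a choice between consistency and availability.

```python
# Toy sketch of the CAP trade-off (illustrative classes, not a real database).

class Replica:
    def __init__(self, mode: str):
        self.mode = mode            # "CP" = prefer consistency, "AP" = prefer availability
        self.value = None
        self.peer_reachable = True  # False simulates a network partition

    def write(self, value) -> str:
        if self.peer_reachable:
            self.value = value      # normal case: replicate to the peer as well
            return "ok"
        if self.mode == "CP":       # the SQL/OLTP-style choice:
            return "rejected"       # refuse the write, stay consistent
        self.value = value          # the NoSQL/OLAP-style choice:
        return "accepted"           # stay available, replicas may diverge

cp, ap = Replica("CP"), Replica("AP")
cp.peer_reachable = ap.peer_reachable = False   # the partition strikes
print(cp.write(42))  # rejected  (consistency over availability)
print(ap.write(42))  # accepted  (availability over consistency)
```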

In the last few years, we have seen the rise of Distributed SQL (NewSQL) databases, which combine the consistency of SQL with the scalability of NoSQL databases. Although many of them (CockroachDB, AWS Aurora) are gaining lots of traction, there is still room for improvement. In the next decade, we will see even more innovation in the Distributed SQL field.

In the 2030s, we may see truly distributed SQL thanks to innovation in many other areas (e.g., specialized hardware, quantum computing). One idea could be “Quantum Entangled SQL Databases,” where clusters of quantum-entangled SQL databases offer the consistency of SQL databases even when one database is on Earth and another is on Mars.

Image by David Mark from Pixabay

In the last decade, we have seen an explosion of data-intensive applications. We have batch processing tools (Spark, Hadoop MapReduce), stream processing tools (Flink, Storm), queuing (Kafka, Pulsar), full-text search tools (Solr, Elasticsearch), caching (Redis), column stores (Cassandra), and row stores (SQL databases). The downside is that there is no SQL-like abstraction for data processing today. Currently, finding the right data-intensive tool for a specific data model is a daunting task.

In the next decade, we will see the convergence of many data processing tools, offering unified data modeling for both batch and stream processing. In the 2030s, we will find data-intensive applications less fragmented and more unified. We will also see tools that abstract many data models (e.g., streaming, full-text search, caching, column operations, row operations) within the same data processing framework. Data-intensive applications will also become more composable (like Unix tools), so that we can easily plug multiple applications together. In the 2030s, developers will connect a distributed SQL database to a full-text search engine with just a “pipe”-like operator, as sketched below.
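
Here is a speculative Python sketch of what such a pipe might look like; the classes and the overloaded `|` operator are hypothetical, not an existing framework:

```python
# Speculative sketch of a "pipe"-style composition between data tools.
# SQLSource, FullTextIndex, and the overloaded | operator only illustrate
# the composability idea; they are not a real system.

class SQLSource:
    def __init__(self, rows):
        self.rows = rows              # (row_id, text) pairs, as if from a query

    def __or__(self, stage):          # `source | stage` streams rows onward
        return stage.consume(self.rows)

class FullTextIndex:
    def __init__(self):
        self.index = {}

    def consume(self, rows):
        for row_id, text in rows:     # build an inverted index from the rows
            for word in text.lower().split():
                self.index.setdefault(word, set()).add(row_id)
        return self

    def search(self, word):
        return self.index.get(word.lower(), set())

index = SQLSource([(1, "distributed sql database"), (2, "full-text search")]) | FullTextIndex()
print(index.search("sql"))            # {1}
```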

Image by mmi9 from Pixabay

Desperate times often lead to disruptive innovations. During the global financial crisis of 2007–2008, Satoshi Nakamoto combined five or six existing technologies (hashing, Hashcash, public-key cryptography, peer-to-peer networking) and created Bitcoin, the first cryptocurrency. The success of Bitcoin led to the rise of a new technology: Blockchain, or the Distributed Ledger. In the last decade, many advancements in Blockchain technology have opened the door to non-cryptocurrency use cases. One such innovation is Ethereum, where a block can carry a piece of code. The advantage of Ethereum is that this piece of code, the Smart Contract, is generic and can be mapped to anything, including cryptocurrency.
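
The chaining idea at the heart of Bitcoin is easy to show. The bare-bones Python sketch below links blocks by their hashes (no network, no proof-of-work, no consensus), so tampering with any past record invalidates the rest of the chain:

```python
import hashlib
import json

# Bare-bones illustration of a hash-linked ledger: each block stores the
# hash of its predecessor, so altering any past record breaks every later
# link. No network, no proof-of-work, no consensus; just the chaining idea.

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, data: str) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev": prev, "data": data})

chain = []
add_block(chain, "Alice pays Bob 5")
add_block(chain, "Bob pays Carol 2")

chain[0]["data"] = "Alice pays Bob 500"   # tamper with history...
valid = all(chain[i]["prev"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))
print("chain valid:", valid)              # ...and the chain no longer verifies: False
```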

Although Blockchain is a disruptive technology, it has many limitations that hinder its mass adoption. In the coming decade, we will see many innovations in Blockchain, and many of its limitations will be resolved. In the 2030s, Blockchain will be a well-established technology. It will be used in many contract- and transaction-based fields that are centralized today: financial transactions, real estate contracts, oil and gas purchases, supply chains, copyright, and music sharing. During the 2030s, Quantum Computing will start to threaten classical encryption. As conventional encryption is key to Blockchain, Blockchain will go through significant changes during the 2030s and will adopt Quantum encryption.