Tuesday, January 17, 2017

The great developer divide

This will be uninteresting to anyone looking for code snippets. It is going to be a kind of theory or history class with some predictions. As always, predictions may or may not come true. Let us come to the point. What is meant by the developer divide? In a simple sense, developers are going to be divided into 3 species.
  • Application integration developers
  • AI / Algorithm developers
  • System / infrastructure developers
    • Cloud developers, which includes device driver writers
Before you object that there have already been groups of developers with more or less these names for at least the last 10-15 years, let us see what is new.

Evolution in Biology

Let's look at how new species evolve in biology. If you are an opponent of evolution, you may as well stop reading: without understanding biological evolution, it is difficult to understand any evolution, including the formation of new languages. In biological evolution, when one group of a species starts doing different things than another group, or the groups are separated geographically, a new species is formed. If the divide lasts only a short time, they can mingle together and continue as one species. Otherwise, there will be 2 species.
Another factor in the survival or destruction of a species is its ability to adapt to the environment. For example, consider the famous peppered moth scenario from England during the Industrial Revolution.
This is the easy, old-fashioned explanation. The real driver is gene mutation, which goes one level deeper. Only the members who get useful mutations produce more offspring, and after a long time they become a new species with a clear distinction from the old one.

Characteristics of these developer species

Coming back to software, what are the characteristics of these new species?

Application integration developers

These are the majority of developers. They write applications which accept data from users and store it somewhere. When users want the data, the applications just show it to them, or send it periodically by means of some push mechanism. Some transformations will be done on the data, but they are mostly about formatting. These developers never create new data, i.e. knowledge derived from the data, except for the generation of logs.

More importantly, this species of developers will vanish or fade to a low profile as the other species grow stronger. It will be difficult for these application developers to convert to the other species, though some may succeed and survive. Whatever these developers were doing will become a task for business people. For example, a business analyst will be able to assemble applications and do most of the customizations.

There are already environments where we can see this happening; Visual Studio LightSwitch is one example.

AI / Algorithm developers

This species will write complex algorithms and expose them as services with the help of the infrastructure developers. Application integration developers will consume the algorithms written by this group, until they go extinct with the invention of AI programs that do the coding.

This type of developer will be high profile, high in number, and will enjoy job security for a longer time. They will know the basics of application integration, but never what the infrastructure developers do or how.

We can already see examples, such as Algorithmia, Azure Machine Learning, etc.
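
The consumption pattern these platforms offer can be sketched roughly as follows. This is only an illustration: the payload shape and the classifier behavior are hypothetical, and the remote HTTP call is replaced by a local stand-in function so the idea is visible without any real service.

```python
import json

def algorithm_service(request_body):
    """Local stand-in for a hosted algorithm (say, an image classifier).
    A real platform would run something like this behind an HTTP endpoint;
    the application integration developer only sees JSON in, JSON out."""
    data = json.loads(request_body)
    # Pretend a complex model scored the input. The consumer never
    # needs to know how the score was produced.
    score = len(data["image_url"]) % 100 / 100.0
    return json.dumps({"label": "face", "confidence": score})

# The consumer side: build a request, call the "service", read the result.
request = json.dumps({"image_url": "http://example.com/photo.jpg"})
result = json.loads(algorithm_service(request))
print(result["label"])
```

The point is the shape of the interaction, not the algorithm: the application integration developer deals only with the request and response, never the model inside.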

System / infrastructure developers

This species will know how to deal with the bare metal machine, whether it is silicon based, quantum based, or even a carbon based biological machine. They cleverly abstract the hardware away from the application integration developers and the AI developers.

There will be fewer developers in this species, but they will control most of the things. They may consume the algorithms written by the other species without knowing how they are done.

Reasons for this prediction

No one can just predict something and run away. There should be some kind of reasoning behind the prediction.

Bare metal is going away from developers

This is something nearly everyone agrees on nowadays. Many abstractions are being built over the hardware, and the main reason is to code once and run everywhere. Other reasons we hear are reusability, lower development cost, and so on.


When we were developing for the Intel 8085, we had to know the CPU registers, their sizes, the memory addressing, and the port addresses. Now, if we ask a new generation .NET, Java, or JavaScript developer how many registers there are in the CPU, they will not have any clue. Going further back in time, there were punch cards, and developers had to know many mechanical and electrical properties of the system. After the 8085 days, we got assemblers and compilers, and we entered the managed world.
Now, from the managed world, we are moving to the world of integration. Connect some dots and the software is ready. Even for device programming JavaScript is used, which completes the story.

Cloud is abstracting

Another abstraction is happening in the distributed systems world. Distributed here means work done across 2 or more Turing machines, with data serialized and passed between them. Cloud started slowly with IaaS, where we got virtual machines. Then came PaaS, where we don't need to worry about machines anymore. It has now reached FaaS, where each function is a service and can be deployed independently. Another sweet name for this is Serverless. People now point out that despite the name, there are servers underneath, but the developer doesn't need to worry about them.
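
A minimal sketch of the FaaS idea, assuming the common `handler(event, context)` convention (the shape used by platforms such as AWS Lambda, though names vary by provider). The function body is all the developer writes; here we simply invoke it locally, because the platform machinery around it is exactly what the developer no longer sees.

```python
def handler(event, context):
    """One independently deployable function. The platform decides
    where, and on what server, it runs; the developer writes only
    this body and the platform supplies event and context."""
    name = event.get("name", "world")
    return {"statusCode": 200, "body": "Hello, " + name + "!"}

# Locally we can only simulate what the platform does:
# hand the function an event and collect the response.
response = handler({"name": "FaaS"}, context=None)
print(response["body"])  # → Hello, FaaS!
```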

But think of a time when the underlying machine is a quantum or biological computer. We can still call it a server, but the underlying system will have completely changed, and consumers will have no clue how the FaaS executes their code.

AI Algorithms are too complex to be understood by all

Earlier, colleges taught sorting and searching algorithms, but the people who studied them were not coding those algorithms every day, because they came as part of the standard libraries. That helped developers concentrate on the real business problem, which is good. Later, business requirements got a new face: data analytics. Everyone wanted to analyze their data. Some at least knew what they wanted from the data; some were just fiddling with it, hunting for treasure. That brought good momentum to the field of algorithms. Many algorithms were developed, but the same question came up again: should we just consume those algorithms, the way we consume sorting and searching, or should we learn them? Obviously the route is the same: consumption.
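
The sorting example can be made concrete. Nobody hand-codes a sort for everyday work, because the standard library already ships a tuned implementation; in Python that is the built-in `sorted`, used here on an invented list of orders:

```python
# Consuming the algorithm: one call, no knowledge of the internals.
orders = [("mouse", 25), ("laptop", 900), ("cable", 5)]
by_price = sorted(orders, key=lambda item: item[1])
print(by_price[0])  # → ('cable', 5), the cheapest item
```

The developer states *what* to sort by, and the library decides *how*. The prediction is that ML algorithms will be consumed through exactly this kind of interface.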

Another reason is their complexity. It is very difficult for a normal application developer to know how a face recognition algorithm works, as it requires prior knowledge of several fields. So consumption is the easy route. This eventually deepens the developer divide.

There may be many more reasons not listed here. In any case, developers are undergoing the unavoidable division that occurs in every growing field of science and technology. The best practice is to pick your own field and become an expert in it.

Thanks for reading.
