Distributed Storage
As another consequence of the advent of "the Cloud", the next few years will force developers to think differently about persisting their data to disk. A few years ago, your storage options were limited to an RDBMS or writing to a file on disk (which, unless you were using some network-based technology like NFS, was more often than not a poor decision). The future will offer more complex choices, meaning more flexible and more precise decision making.
The modern developer will need to understand the difference between the 3 types of distributed storage: block storage, file storage, and object storage. If you're curious, here's a brief introduction to the subject.
There are also several technologies challenging the traditional RDBMS, and more keep appearing. All these technologies aren't competing; they complement each other to offer a wide array of solutions to different problems. It will be up to the programmer to know them and pick the right tool for the job. Every single time.
Programming languages of the future
Concurrency will be treated as a first-class citizen. The traction that languages like Go or Clojure are getting tells me that in the (near) future, programming languages won't add concurrency features as an afterthought: it will be a built-in feature (and a founding principle) of the language itself. Apps of the future will be distributed by default, leveraging Cloud APIs and modern storage systems. Old-school programming languages like C or Java do not give the developer the appropriate tools to deal with the complexity of such systems.
Another big trend I see happening is that static typing is fashionable again. Arguably, the past 5-10 years have seen dynamic languages like Python or JavaScript rise drastically in popularity. One of the features these languages provide is decreased verbosity, since they have no static type annotations. This impacted programmers' productivity positively and allowed for faster release cycles.
However, type theorists have been working in the shadows on better type inference algorithms, and implementations have been making their way into lots of modern languages like C++, Go, Rust or Swift. This will free the programmer from the burden of writing out the type at every turn, while still allowing compilers to do proper type checking.
A tightly linked trend is that native/compiled code is fashionable again. Go, Rust, C++, Swift, but also rediscovered languages like D or Haskell, all sport native compilers. This seems to be a departure from the past trend of deploying a complex runtime and calling it a "virtual machine" (JVM-style). Are programmers tired of fighting against their garbage collector?
Finally, there are still a few older trends that are worth keeping an eye on. They are continuations of the trends of the past 5 years or so, and they don't seem to be slowing down. Functional programming is still on the rise. Java's recent adoption of lambdas will definitely have an impact, simply because there are so many Java devs out there. Also, JavaScript is still worth following closely. It will be the universal runtime. I'd recommend looking at projects like asm.js or Google's NaCl.
Trends in Virtualization
These trends are specific to the industry I work in, but I think their impact is going to be felt in every other field of programming and IT at large.
The first trend is OS-level virtualization, sometimes called "container virtualization". This model of virtualization is a great addition to the traditional model of hypervisor-level virtualization. In a nutshell, a hypervisor is software that emulates hardware functions. To run a virtual machine, the user installs a separate operating system on top of her hypervisor and ends up with a separate entity to manage. Container technology removes a lot of that complexity by relying directly on the existing OS. Think of it as chroot on steroids: a process running inside a container can only access a limited subset of resources, and those resources cannot be accessed by other containers even when they're running on the same OS/kernel.
The benefits of this technique are huge, mainly in removing a lot of unnecessary overhead. Container technology has been around for a while now; OpenVZ is almost 10 years old, and Solaris and FreeBSD have had containers for a long time too (FreeBSD calls them "jails"). However, a recent project, Docker, greatly simplifies the way admins, devs and users interact with them. This project is gaining a lot of traction, and in my opinion it will heavily influence the way we develop, ship, deploy and host our apps in the future.
The second trend to monitor is network virtualization: the belief that specialized networking hardware is dead and that networking functions should be delivered by software running on commodity hardware. All the major players in the industry have now rallied behind two movements, Software-Defined Networking and the much more recent Network Functions Virtualization, to fundamentally rethink how networks are built.
It is way too early to know exactly how this will impact us, but it is definitely disrupting the way our networks are built. They'll be more flexible, more fault tolerant, and overall better adapted to the huge number of connected devices we'll see in the coming years.
Of all the trends I have mentioned in this (long) post, I think that last one about network virtualization will be the most disruptive and deserves the closest scrutiny.