Living in an ever-changing world, we watch technologies once considered miraculous become mundane within a few years. Here’s a quick checklist of the miraculously mundane which you may have missed.
The 2010s saw technologies once considered futuristic become ubiquitous. It was the decade mankind discovered water (fine, signs that water had once been there) on Mars. It was the decade the stars came within our reach. And it was the decade immortality (or at least longevity of a magnitude unthinkable before) became a possibility. We absorb and adopt technologies rapidly, and what was once considered miraculous becomes mundane in a few years. Here’s a quick checklist of the miraculous, the mundane, and the miraculously mundane.
3D Printing

3D printing isn’t a concept born in the 2010s; indeed, it was first suggested in the early 1970s. But the decade now ending was the one in which additive manufacturing came of age, became a commercial (even household) application, and extended beyond plastics to metals.
LiDAR and Tesla Autopilot
LiDAR, which stands for light detection and ranging (and which is decades old), is a laser-sensor combination: it illuminates surrounding objects with laser pulses and measures their reflections with sensors to build a picture of the entire environment. In the 2010s, it became, quite possibly, the core technology that made autonomous vehicles possible. In 2014, Tesla began shipping the Model S with Autopilot hardware, releasing the software the following year and marking a milestone in the recent history of self-driving vehicles (though Tesla, notably, relies on cameras and radar rather than LiDAR).
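The ranging half of LiDAR comes down to simple arithmetic: a laser pulse’s round-trip time, multiplied by the speed of light and halved, gives the distance to whatever reflected it. A minimal sketch of that calculation (the function name and the 200-nanosecond echo are illustrative, not taken from any real sensor):

```python
# Time-of-flight ranging, the arithmetic at the heart of LiDAR:
# a pulse travels out to the object and back, so the one-way
# distance is (speed of light x round-trip time) / 2.
SPEED_OF_LIGHT = 299_792_458  # metres per second

def distance_from_echo(round_trip_seconds: float) -> float:
    """Distance in metres to the object that reflected the pulse."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse that echoes back after 200 nanoseconds came from ~30 m away.
print(distance_from_echo(200e-9))  # 29.9792458
```

A real sensor sweeps millions of such pulses per second across the scene, turning these individual distances into a 3D point cloud.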
Smartphones and 4G

Sure, like many of the technologies mentioned in this page, the smartphone has been around since the second half of the 2000s, but it was only in the 2010s that it became ubiquitous, overtook and then overwhelmed so-called feature phones in sheer numbers, and grew more powerful than the first supercomputers. This was complemented by 4G and Long Term Evolution (LTE), which made rapid wireless transmission of heavy data files, including video content, possible.
Cloud & IoT
Again, the Cloud isn’t a concept of the 2010s, but it was in this decade that it really became commonplace and, more importantly, critical and necessary even to individuals (as opposed to enterprises). Coupled with faster wireless technologies (such as 4G and LTE) and powerful smartphones, it has changed the face of almost every industry. Meanwhile, the Internet of Things (as it is called) matured over the course of the decade, bringing everything from fitness trackers to cars to shoes to CT scanners online.
Biometrics and Fintech
The 2010s saw facial recognition technologies become commonplace – indeed, the iPhone’s move from fingerprint to facial recognition indicates the technology has reached the mass-adoption stage. Among other areas, this and other biometrics are used in fintech, which, again, isn’t a child of the 2010s, although it was in this decade that it reached critical mass in terms of volume and business models.
Blockchain

Blockchain became popular thanks to Bitcoin, but its applications transcend cryptocurrency. A blockchain is a public ledger made up of blocks, each of which carries a cryptographic stamp (a hash) of the previous block – hence the name. The ledger is usually replicated across a distributed network of computers, making it extremely difficult to tamper with. It is already being used in contracts and fintech, and its use in things such as supply chains and energy grids has been demonstrated.
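That chaining can be sketched in a few lines. This is a toy illustration of the principle, not Bitcoin’s actual data structures; the block contents and helper names are invented for the example:

```python
import hashlib
import json

def block_hash(data: str, prev_hash: str) -> str:
    """Cryptographic stamp over a block's data and its predecessor's hash."""
    payload = json.dumps({"data": data, "prev_hash": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def make_block(data: str, prev_hash: str) -> dict:
    return {"data": data, "prev_hash": prev_hash,
            "hash": block_hash(data, prev_hash)}

def chain_valid(chain: list) -> bool:
    """Recompute every stamp; editing an earlier block breaks a later link."""
    for prev, block in zip(chain, chain[1:]):
        if block["prev_hash"] != block_hash(prev["data"], prev["prev_hash"]):
            return False
    return True

genesis = make_block("genesis", "0" * 64)
b1 = make_block("Alice pays Bob 5", genesis["hash"])
b2 = make_block("Bob pays Carol 2", b1["hash"])
chain = [genesis, b1, b2]

print(chain_valid(chain))            # True
genesis["data"] = "Alice pays Bob 500"   # tamper with history...
print(chain_valid(chain))            # False: the next link no longer matches
```

Because every block’s stamp depends on the one before it, rewriting history means recomputing every subsequent block – which, on a large distributed network, is what makes tampering impractical.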
Reusable Rockets

The new space race, exemplified by Jeff Bezos’ Blue Origin and Elon Musk’s SpaceX, is in reusable rockets. Again, this is not a new concept, but it was only in the 2010s that significant progress was made on both fronts. In 2015, Blue Origin’s New Shepard became the first rocket to reach outer space and then land vertically under its own power (its crew capsule returned by parachute). Just about a month later, SpaceX achieved the first soft landing of an orbital rocket’s first stage. SpaceX now flies the reusable Falcon 9 and Falcon Heavy. Why are they important? How else will we colonise space!
Climate Change

It was always suspected, but in the 2010s it became clear that the world was set to see temperatures cross the benchmark of 1.5 degrees Celsius over pre-industrial levels for sure, and perhaps even the 2-degree threshold, if countries didn’t start focusing on the climate crisis. When a similar listicle is drawn up for the decade of the 2020s, every technology on it will be related to this.
Cancer Immunotherapy

Of all ailments, cancer perhaps highlights human mortality and the limits of medicine more than any other. In 2018, James Allison shared the Nobel Prize in Medicine for work on what may be the closest thing yet to the holy grail of medicine – a treatment that turns the immune system against cancer. Again, the concept isn’t new. Allison and his colleagues identified CTLA-4, a protein that acts as a brake on the immune system’s capability to fight cancer cells, and, in the mid-1990s, showed that blocking this protein unleashed the immune response against tumours (in mice). Human testing in the 2000s confirmed this, but progress was slow, and it was only in 2011 that the US drug regulator approved an anti-CTLA-4 treatment for a type of cancer.
Mirrorworld

In March, Wired magazine ran a cover story called Welcome to Mirrorworld, in which it argued that the mirrorworld will be the “third big” platform after the “web” and “social media”. The mirrorworld is a blend of Augmented Reality and Virtual Reality that marries the real with the digital, creating a new layer on top of the existing physical world. The article described Pokémon Go as among the early glimpses we have been given of the mirrorworld. Indeed, if the 1990s and 2000s were all about the virtual, the 2010s heralded the arrival of a mixed real + virtual world.
Lithium-ion Batteries
This is as much a challenge as a technology. By the middle of the 2010s, the lithium-ion battery had become one of the most critical components of anything that required a power source. But the current technology is environmentally unsustainable, especially because it uses nickel and cobalt (extraction of the latter is a hazardous process). Given that everything, including vehicles, is now becoming battery-powered, the world needs to quickly come up with a new battery technology that can be replicated at mass scale.
Artificial Intelligence

Machines (and software) that think have been the subject of conjecture and speculative fiction for decades. In the 2010s, deep learning, Big Data, and greater computational power made a form of machine intelligence a reality. These systems can learn, and can do some things better (and faster) than humans, but they still can’t think.
Big Data

A concept of the 1990s, Big Data refers to technologies that can process unimaginably large volumes of data – volumes that standard data-analysis technologies cannot handle – and glean insights from them, perhaps even in real time for instant action. Big Data came of age in the 2010s, thanks to machines with greater processing power, and because the rapid rise of smartphones and data networks meant that far more data was being created. It has applications in everything from governance to banking to national intelligence. Interestingly, Big Data, in combination with deep learning (essentially, technologies that make it possible for machines to learn), is what made Artificial Intelligence a reality.
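The trick that lets Big Data systems chew through volumes no single machine could handle is the map-reduce pattern: turn each record into key-value pairs, then sum the pairs per key, with each stage spread across many machines. A toy single-machine sketch of the idea (the log lines and names are invented for illustration, and belong to no particular framework):

```python
from collections import Counter
from itertools import chain

# Invented sample records standing in for a huge stream of logs.
logs = [
    "login alice", "login bob", "error disk",
    "login alice", "error net",
]

# Map: turn each record into (key, 1) pairs.
mapped = chain.from_iterable(
    ((word, 1) for word in line.split()) for line in logs
)

# Reduce: sum the counts per key -- in a real cluster, each key's
# pairs would be routed to, and summed on, a separate machine.
counts = Counter()
for key, n in mapped:
    counts[key] += n

print(counts["login"])  # 3
print(counts["error"])  # 2
```

Because both stages work record-by-record and key-by-key, the same computation scales from one laptop to thousands of servers – which is exactly what made decade-scale data volumes tractable.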