If Covid-19 has taught us anything, it is that the march of digital transformation in today’s economy is relentless. With insiders describing the pace of change as being “two years’ worth of digitalisation in two months”, the post-Covid-19 future will likely send the already rapid pace of digitalisation into overdrive. Technology stocks are, of course, booming.
“2020 was a year of acceleration as we have seen,” says Amit Midha, president for Asia Pacific and Japan (APJ) and global digital cities at Dell Technologies. “We had a front-row seat to the overnight digital transformation and drastic changes to how we work, learn and live. Today, despite us working from home, businesses have continued to run smoothly. That is the power of technology.”
Dell has profited from stronger digitalisation. On Nov 24 last year, the company reported a net income of US$881 million ($1.17 billion) for the three months ended Oct 30, up 60% y-o-y, on the back of a 3% y-o-y growth in revenue to US$23.4 billion. The growth was largely driven by higher demand for home computers for remote working and learning during the pandemic.
“Most households had computers. But if you were a family of five, you might not have had five computers. And you might not have needed it before Covid-19,” Dell’s global chief technology officer (CTO), John Roese, said at the “Perspectives — 2021 and beyond” APJ media briefing on Jan 8. Beyond selling more notebooks in the pandemic year and joining the AI hype, Dell has identified four under-reported technology trends, with three potentially impacting society as early as 2021.
Trend 1: The future is quantum
Quantum computing — long the stuff of science fiction — could take a step closer to becoming a reality in 2021. While quantum computing for general use remains some way off, this year, computer scientists with no prior access to quantum computing will be able to enter a quantum simulator and learn quantum programming languages like Q# (pronounced “Q sharp”). This is because of the wider availability of quantum systems in public clouds, industrial companies and government labs.
“This will be the year that will enable broader software development ecosystems to experiment with quantum computing,” says Roese, who believes that improved access will be the first step in devising viable use cases for the technology that could become relevant in the near future.
Quantum computing involves computers that exploit the quantum mechanics of sub-atomic particles, encoded as quantum bits (qubits), to deliver huge leaps in processing power, potentially outstripping supercomputers running on binary code. While quantum computers are unlikely to eliminate conventional computing, they are better suited to solving optimisation problems and to accelerating scientific and industrial research.
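For a sense of what “entering a quantum simulator” involves, the sketch below simulates a single qubit using nothing more than linear algebra. It is a minimal illustration, not the tooling Roese describes; languages like Q# and the cloud-hosted simulators build on the same mathematics at far greater scale.

```python
# A minimal sketch of what a quantum simulator computes: one qubit
# placed into superposition. Illustrative only.
import numpy as np

# A qubit's state is a 2-component complex vector; |0> = (1, 0).
state = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate puts the qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ state

# Measurement probabilities are the squared amplitudes (the Born rule).
probs = np.abs(state) ** 2
print(f"P(0) = {probs[0]:.2f}, P(1) = {probs[1]:.2f}")  # 0.50 each
```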
Andrew Shipilov, professor of strategy at Insead, sees quantum computing powering driverless cars, scanning their environment to determine optimal driving behaviour. Smart cities, he writes in Insead’s Knowledge publication, could use quantum computing to optimise travel times for commuters and manage urban congestion. Other quantum computing applications include simulating molecules for pharmaceutical research.
Trend 2: Semiconductors re-organised
Semiconductors have dominated the news recently, owing to their status as a strategic component. The US-China trade war has revolved strongly around the industry because of the indispensable role semiconductors play in almost all things electronic. But Roese notes that with computers not scaling quickly enough to meet industry needs, the very way semiconductors are used and arranged must change to fit the needs of a more digital era.
“We’ve been on a move to change the way we think about computing, from homogeneous compute to heterogeneous compute, for a number of years,” he explains. While homogeneous computing — where all software works on a single computer architecture — has worked well and enabled new technologies like cloud, the need to scale computing more rapidly requires a shift to heterogeneous computing. Specialised computing based on domain-specific architectures is needed to make functions like AI and cryptography run at higher efficiency and performance levels.
Currently, most computers run on x86, a family of instruction set architectures that powers most computing devices, from desktop personal computers to laptops, workstations and servers. In the era of heterogeneous computing, however, x86 will increasingly be augmented by a rich ecosystem of domain-specific accelerators like AI chips. And it is this change in computing, Roese predicts, that will cause a fundamental shift in the semiconductor ecosystem as manufacturers adapt their products to fit into the new heterogeneous architecture.
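How work gets split between the x86 core and such accelerators can be pictured with a toy dispatcher, sketched below. The device names and the two-entry registry are hypothetical; real heterogeneous schedulers weigh data movement, queue depth and power, but the routing idea is the same.

```python
# A minimal sketch of heterogeneous dispatch: route each workload to a
# domain-specific accelerator when one exists, falling back to the
# general-purpose x86 core otherwise. Device names are hypothetical.
ACCELERATORS = {"ai_inference": "npu0", "crypto": "crypto_engine0"}

def dispatch(workload_type: str, payload: bytes) -> str:
    device = ACCELERATORS.get(workload_type, "x86_core")
    return f"running {workload_type} ({len(payload)} bytes) on {device}"

print(dispatch("ai_inference", b"image tensor"))   # -> npu0
print(dispatch("video_transcode", b"frame data"))  # -> x86_core fallback
```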
In tandem with the improvements in hardware, software is expected to be more tightly integrated with the x86 core to ensure that the different components function as a coherent system. Dell has been working on developing such software since 2019; Roese sees this change happening both technically and at an industry level in 2021. Kurt Cagle, community editor at Data Science Central, thinks it is only a matter of time before heterogeneous computing becomes the norm.
Trend 3: 5G as a service
As Roese readily admits, 5G, the fifth-generation mobile network technology offering wireless downloads of up to 10 gigabits per second, is, strictly speaking, not new. Last year was the “first year of 5G”, with such networks gradually rolling out across the world. But what people are not talking about enough, argues the Dell CTO, is that the use cases for 5G are about to change as enterprises take over from consumers as the main users of this transformative technology.
“The first wave of 5G was really an extension of 4G,” says Roese, describing the main achievement of 5G to date as giving users slightly quicker broadband access. Subsequent improvements to the 5G standards, however, make it a truly standalone technology, bringing latency down from tens of milliseconds to as low as 10 milliseconds, low enough for driverless cars to move from blueprint to reality.
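A rough calculation shows why that latency drop matters for vehicles. The 100km/h speed and the 50-millisecond starting point below are illustrative assumptions; the 10-millisecond figure is Roese’s.

```python
# Back-of-envelope: how far a vehicle travels while waiting on the network.
speed_kmh = 100
metres_per_ms = speed_kmh / 3.6 / 1000  # ~0.028 m per millisecond

# "Tens of milliseconds" (assumed 50 here) vs Roese's 10 ms figure.
for latency_ms in (50, 10):
    blind = metres_per_ms * latency_ms
    print(f"{latency_ms} ms latency -> {blind:.2f} m travelled per round trip")
# 50 ms: ~1.39 m of blind travel; 10 ms: ~0.28 m
```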
With these new enterprise use cases dominating the 5G landscape, such connectivity will no longer be the preserve of proprietary telco networks. Rather, it will be cloud and IT that define the architecture of 5G networks. That architecture, Roese predicts, will likely be open, disaggregated and software-defined for the first time.
This is a positive move for the IT industry, says Tom Canning, global vice president, devices and Internet of Things (IoT), at software firm Canonical, in an op-ed for Network Computing. Instead of the more limited and exclusive proprietary technology model run by telcos, software-defined networking seeks to decouple wireless network infrastructure from expensive, closed hardware and “shift it to an intelligent software layer running on commodity hardware”.
This “open source” approach better supports the higher speeds and lower latency of 5G, as well as the large number of endpoints in IoT. “Disruptors like Netflix, Facebook and WhatsApp almost certainly would not exist in a proprietary-only world. Imagine what an open, software-defined model will do to help IT managers meet the need for faster, more flexible and more secure systems and platforms for 5G,” Canning adds.
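The decoupling Canning describes can be sketched in a few lines: policy lives in a software control plane that programs a forwarding layer running on any commodity box. The class and rule names below are hypothetical, a minimal sketch of the pattern rather than any vendor’s stack.

```python
# A minimal sketch of software-defined networking: the network function
# is ordinary software, so it runs on commodity hardware rather than a
# closed, single-purpose appliance. Names are hypothetical.
class PacketForwarder:
    """Data plane: forwards traffic according to rules pushed from above."""
    def __init__(self):
        self.rules = {}

    def forward(self, destination: str) -> str:
        return self.rules.get(destination, "drop")

class Controller:
    """Control plane: decides policy and programs any forwarder below it."""
    def program(self, fwd: PacketForwarder, destination: str, port: str):
        fwd.rules[destination] = port

forwarder = PacketForwarder()          # could run on any commodity server
Controller().program(forwarder, "10.0.0.7", "port2")
print(forwarder.forward("10.0.0.7"))   # -> port2
```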
Dell has consequently entered the 5G space to tap this change, starting a telecom systems business unit in February 2020. Other new players from the cloud and IT world are also looking to enter the ecosystem in 2021, notes Roese. Microsoft, The Financial Times reports, teamed up in 2020 with telcos like Verizon, Vodafone and Deutsche Telekom to launch dedicated 5G networks for business customers, including in the manufacturing and logistics sectors.
Trend 4: Edge proliferation
Adding to the trend of decentralised computing is the growing prevalence of edge computing: bringing computation and data storage closer to the devices where data is gathered, rather than keeping them in a faraway centralised location. This minimises latency and saves on the costs of centralised processing.
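The trade-off reduces to a placement decision: process beside the devices when the latency budget is tight or the raw data is too bulky to backhaul cheaply, and centralise otherwise. The thresholds in the toy function below are hypothetical.

```python
# A toy illustration of the edge-versus-cloud trade-off.
def place_workload(latency_budget_ms: float, data_rate_mbps: float) -> str:
    # Hypothetical thresholds: tight latency or heavy data stays local.
    if latency_budget_ms < 20 or data_rate_mbps > 100:
        return "edge"   # process beside the sensors
    return "cloud"      # centralised processing is fine

print(place_workload(latency_budget_ms=5, data_rate_mbps=10))   # -> edge
print(place_workload(latency_budget_ms=200, data_rate_mbps=2))  # -> cloud
```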
But Roese notes that edge is going to face a problem. With almost “everyone” building an edge — with Dell, Microsoft and Google among the more active players — the self-contained nature of these different edge computing networks presents interoperability problems between them. “If we continue on that path, customers would start to realise that without some change in the way we think about edge, we will have a proliferation of independent siloed edges,” he warns.
The Dell CTO therefore expects discussions about edge from 2021 onwards to centre on edge platforms, the pool of physical capacity that serves up compute, storage and networking for edges to use, as well as on edge workloads, the software functions that run on these physical platforms. Companies, he predicts, will increasingly specialise in one of these functions instead of building everything in-house. Dell, he notes, has cast its lot with the edge platform builders going forward.
As a result of this shift towards specialisation, edge experiences will no longer run on their own independent edge infrastructure. Rather, from 2021 to 2022, Roese sees a shift towards edge platforms that run multiple edge experiences as software-defined services. This, he argues, will resolve the problem of edge proliferation and make edge more streamlined and cost-effective, accelerating adoption.
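Roese’s platform/workload split can be pictured as one shared pool of edge capacity hosting several software-defined workloads, instead of each workload shipping with its own silo. The sketch below, with hypothetical names and capacity figures, shows the shape of the idea.

```python
# A minimal sketch of the platform/workload split: one shared pool of
# edge capacity hosting several workloads. Names and units hypothetical.
class EdgePlatform:
    def __init__(self, cpu_cores: int):
        self.free_cores = cpu_cores
        self.workloads = []

    def deploy(self, name: str, cores: int) -> bool:
        if cores > self.free_cores:
            return False              # platform full; no new silo is built
        self.free_cores -= cores
        self.workloads.append(name)
        return True

site = EdgePlatform(cpu_cores=16)         # one physical footprint...
site.deploy("video-analytics", 8)         # ...shared by several edge
site.deploy("predictive-maintenance", 4)  # experiences as software
print(site.workloads, site.free_cores)    # two workloads, 4 cores free
```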
Roese also argues that with edge increasingly delivered as platforms and as an extension of cloud services, more of the edge ecosystem will be delivered as a service. He notes that in 2020, Dell teamed up with FedEx and Switch — a firm that builds big data centres — to build co-located edge environments. No longer installed on-site in factories, these environments instead serve factories in the cities where they are built, letting customers deploy edge capabilities quickly without having to build them themselves.