Next-generation computing: cloud, fog, edge and quantum computing embedded with AI

Computing systems have advanced computer science over the last few decades and are now at the heart of the enterprise world, providing services based on cloud, fog, edge, serverless and quantum computing. Modern computing systems solve many real-world problems that demand low latency and fast response times. This helps young talent across the globe launch startups that harness large-scale computing power to solve challenging problems and accelerate scientific progress.

 

Artificial Intelligence (AI), Machine Learning (ML) and Deep Learning (DL) have grown in popularity in recent years as they have pioneered improvements in accuracy in computer vision, natural language processing and other related applications. Training these models has been enabled by the huge amounts of data collected in previous years and by the development of state-of-the-art computing hardware such as graphics processing units (GPUs), Google's tensor processing units (TPUs) and Tesla's Dojo processing units. The benefit runs in both directions: many AI/ML/DL techniques require large-scale computational power and external data sources, which are more readily available through modern computing systems, and it is especially important that methods for training complex AI, ML and DL models can now be run in parallel and at scale. To this end, it is foreseeable that continued interest in AI/ML/DL applications will drive new research into mature data center resource management issues, including VM provisioning, consolidation and load balancing, while also easing scaling challenges; this may pave the way for more widespread use of AI in modern computing systems.

 

AI and ML will create important and demanding requirements for computing systems in the coming years, ranging from the storage, maintenance and analysis of extremely large data streams generated by large-scale heterogeneous IoT and sensor networks, to QoS-aware (latency, energy, cost, response time) customized computational services that adapt to a range of hardware devices while maximizing compliance with multiple criteria, including software-level QoS constraints and financial constraints. Next-generation/future computing can be built with the help of AI/ML technologies that can deal with these problems quickly and efficiently.

 

In this article, we summarize the current research and potential future directions in the related areas as discussed by leading academics, researchers, practitioners, engineers, and scientists in the fields of Cloud Computing, AI/ML, and Quantum Computing.

 

 

Emerging Computing Models

 

 

Integrating AI into the next-generation computing cycle

 

It is becoming increasingly evident that the rise of cloud computing complements the rise of AI. As a result, the use of AI in the cloud can improve the performance, efficiency and digital transformation of the cloud. AI in cloud computing environments is key to making organizations smarter, more strategic and insightful, while also providing greater flexibility, agility and cost savings.

 

 

Artificial intelligence and cloud computing can be combined in a number of ways to enhance cloud computing. Artificial intelligence tools and software are synchronized with the power of cloud computing to provide higher value to existing cloud computing environments. This combination makes organizations smarter, more strategic and insightful; data and applications hosted on the cloud make organizations more responsive and adaptable, while also providing cost savings across the company.

 

Existing capabilities gain intelligence and customers get an excellent experience thanks to the addition of an artificial intelligence layer that helps generate insights from data. As a result, organizations can benefit from a unique combination. The cloud, much like a modern video game console or a Tesla electric car, emits vast amounts of operational data and telemetry. AI-based cloud computing is therefore essentially AI operations: algorithms make sense of all this data rather than relying on humans.

 

In the wake of the COVID-19 pandemic, cloud investment in Q1 2020 was up 37% from Q1 2019 to $29 billion; integrating AI and cloud computing can help organizations get closer to consumers while improving operational efficiency.

 

Cloud computing environments and solutions are helping organizations become more flexible, adaptable and cost-effective, since they significantly reduce infrastructure overhead. Artificial intelligence can bring organizations further freedom by handling huge data repositories, streamlining data, improving workflows and providing real-time insights into day-to-day operations. The burden of operations may shift from processes and people to engineering and data. That is why AI is fueling cloud computing in a variety of ways. The software-as-a-service (SaaS) model is now being used to successfully deliver cloud-based AI: SaaS companies are incorporating AI into their solutions to provide enhanced capabilities to customers and end users. Another way organizations are adopting AI to augment their existing cloud infrastructure is AI-as-a-Service: the use of AI makes applications more intelligent, reducing errors and increasing throughput.

 

The cloud-native paradigm derived from cloud computing transforms traditional monolithic cloud applications into lightweight, loosely coupled, fine-grained microservices. This paradigm allows applications to be updated in a more incremental and economical manner. However, efficient management of microservices can be challenging due to their sheer number and time-sensitivity. AI/ML-based solutions can address some of these challenges: for example, neural network-based approaches can be used to predict the workload of microservices, and ML-based techniques can be used to analyze microservice dependencies.
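To make the first idea concrete, here is a minimal sketch of neural-network-based workload prediction for a single microservice. It assumes scikit-learn is available, and the request-rate trace is synthetic; names such as `history` and `window` are illustrative, not part of any specific platform.

```python
# Minimal sketch (not a specific platform's implementation): forecasting a
# microservice's next-interval request rate from a sliding window of recent
# observations. `history` is a synthetic per-minute trace standing in for
# data collected from a service mesh.
import numpy as np
from sklearn.neural_network import MLPRegressor

history = 200 + 100 * np.sin(np.linspace(0, 20, 500))   # synthetic requests/minute
window = 10

# Build (window of past values -> next value) training pairs.
X = np.array([history[i:i + window] for i in range(len(history) - window)])
y = history[window:]

model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
model.fit(X, y)

# Forecast the next interval from the most recent window; an autoscaler could
# use this value to provision replicas ahead of demand.
next_load = model.predict(history[-window:].reshape(1, -1))[0]
print(f"predicted requests/min: {next_load:.1f}")
```

An autoscaler could use such a forecast to provision capacity ahead of demand rather than reacting after a spike has already occurred.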

 

Here are various advantages of deploying AI in the cloud:

 

- Enhanced Data Management: In today's data-driven world, data is king, and better ways of handling it are required; simply keeping track of data is a major hurdle for organizations. Cloud-based AI tools and applications can identify, update, catalog and provide real-time data insights to customers. AI technology can also be used to detect fraudulent activity and identify anomalous system trends (a minimal sketch of this idea appears after this list). In today's high-risk environment, banks and other financial institutions rely heavily on this technology to remain competitive and secure.

 

- Automation: Thanks to the combination of AI and cloud technology, intelligent automation can now be implemented across the business, removing barriers. Artificial intelligence improves predictive capabilities, as algorithmic models use historical data and other patterns to provide instant insights. Artificial intelligence and cloud solutions can help organizations achieve cognitive automation over semi-structured and unstructured documents, while also driving effective infrastructure management that reduces downtime and its impact. As a result, both the cost of doing business and the customer experience improve.

 

- Cost savings: Cloud computing enables organizations to pay only for the resources they use. This can result in significant savings compared to the typical infrastructure expenditures of building and maintaining large-scale data centers. The savings can be used to build more strategic AI tools and accelerators, which can then be used to increase revenue and fundamentally save the company money. This leads to better quality of operations and lower costs.
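As flagged in the first item above, the following is a minimal sketch of cloud-side anomaly detection over transaction records. It assumes scikit-learn; the feature columns (amount, hour of day, merchant risk score) and all numbers are purely illustrative.

```python
# Minimal sketch: unsupervised fraud/anomaly detection with an isolation forest.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Illustrative transactions: [amount, hour of day, merchant risk score]
normal = rng.normal(loc=[50, 14, 0.1], scale=[20, 4, 0.05], size=(1000, 3))
suspicious = rng.normal(loc=[900, 3, 0.9], scale=[100, 1, 0.05], size=(5, 3))
transactions = np.vstack([normal, suspicious])

detector = IsolationForest(contamination=0.01, random_state=0).fit(transactions)
flags = detector.predict(transactions)     # -1 marks likely anomalies
print(np.where(flags == -1)[0])            # indices a fraud team would review
```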

 

The following issues may also arise when these two technologies are combined:

 

- Integration: It is never easy to achieve seamless integration of two different technologies. In order to achieve integration, companies must first move all applications and technologies to the cloud. This is not an easy task for many companies.

 

- Insufficient data: Large, high-quality data sets are ideal for AI technologies. Companies must ensure that their data is both accessible and clean so that AI can work. Since data is often disorganized or missing, this is a big challenge; the value of the solution depends on high-quality data.

 

- Security and privacy concerns: to prevent data breaches, organizations must be vigilant in protecting sensitive and financial information from being stolen by adversaries, who are likely to target it.

 

To synchronize AI with the cloud, companies will have to invest substantial expertise, resources and money. Only when cloud computing and AI systems are properly integrated can businesses make use of powerful machine learning capabilities such as image recognition and natural language processing. As a result, more businesses will follow suit in the future. Organizations need AI-enabled clouds to keep up with the rapid growth of cloud computing. After successful implementation, AI operations will eventually become the standard approach to cloud management.

 

Today, the cloud is already a powerful technology, but scientists believe that AI will make it even more powerful. With this combination, there will be a fundamental shift in data analysis and management. The combination of AI and the cloud will be a game changer and will deliver unparalleled value to end users in a world awash with massive amounts of data. Now that cloud computing and AI are being used in a much wider range of applications, they are making waves across industries globally; it is clear that AI promises to help companies tackle new and more obvious challenges and open up new horizons for their potential customers.

 

Fog computing was created to complement cloud computing services because of the increasing use of IoT and the need to process and store the massive amounts of data generated. IoT applications that require minimal response time can be supported by fog computing that provides basic network services.

 

 

Due to the decentralized, heterogeneous and resource-constrained nature of fog, it is very difficult to distribute IoT application tasks efficiently across fog nodes while meeting QoS and Quality of Experience (QoE) constraints. Vehicle-to-everything (V2X) communication, health monitoring and industrial automation employ fog computing because it provides computing power close to the user to meet the reaction-time expectations of these applications. These applications, in turn, derive a great deal of information from the widespread use of IoT devices.

 

Cloud computing alone cannot meet these latency requirements because of long-distance data transmission delays and network congestion. Fog computing provides a network of gateways, routers and computing nodes between the data source and the cloud data center, extending cloud computing with low latency, low energy consumption and reduced bandwidth for data transmission. Fog nodes can also process sensitive data instead of sending it to the cloud, thus increasing security. Utilizing data generated by various IoT devices, these applications aim to provide useful information while addressing latency issues.

 

In recent years, researchers have increasingly turned to artificial intelligence to help them analyze large amounts of data for these purposes. The machine learning (ML) and deep learning (DL) subfields of AI provide useful data insights and decision-making assistance.

 

For IoT, 5G means more than just a new era of wireless innovation. From the data center to the edge of the network, more than a trillion sensors, gadgets and machines will be powered by AI and operate autonomously. Fog and edge computing are two of the best technologies for accelerating data analysis and decision making. Many "fog devices" will be networked and share locations as part of a distributed computing system called fog computing. Edge management, data collection, monitoring, analytics and streaming all take place on fog computing nodes at the edge of the network. While an individual fog deployment connects a limited number of devices, the technology is well suited to handling real-time requests and aggregating data from a wide range of sources.

 

Since the introduction of Amazon's cloud services in 2006, we have been able to access all types of resources, build scalable architectures, and use them anywhere at the push of a button. Cisco claimed in 2008 that IoT was one of the technologies that would benefit from cloud computing, although the roots of IoT go back to 1999. For example, we can store and act on sensor data, use artificial intelligence to automate processes, and react in real time to situations that previously required direct human involvement. When the IoT was first introduced, it promised to expand into both professional and personal realms, and sensorization and communication protocols had to evolve to meet these new demands. The integration of sensor data and the use of artificial intelligence have given rise to new paradigms.

 

More broadly, the term "smart city" is applied to urban environments, while "smart factory" describes manufacturing and processing facilities. One thing they have in common is the use of data and automated decision-making combined with automation, which can be readily accomplished through AI and ML technologies.

 

Examples include changing the configuration of a computer or railroad, applying the brakes in a self-driving car, or issuing a preventive maintenance warning. From these examples, it is clear that such decisions and actions cannot be made in the cloud; they must happen on devices closer to the sensors collecting the data.

 

Research on application deployment is already underway in a number of areas, including industry and manufacturing, but there are still many issues that need to be addressed:

 

- Execution time: Time is the most pressing issue for both service providers and customers. One of the main motivations for placing software in the fog is to speed up user response time. For the application placement problem, deep learning algorithms are applied as optimization techniques targeting time-related performance metrics. Using different machine learning algorithms, evolutionary algorithms, or novel combinations of techniques might achieve better results.

 

- Mobility Awareness: we need migration methods and architectures that can handle a wide range of mobile activities.

 

- Resource Scheduling: Managing resources in dynamic environments like fog is difficult because of the limited resources available and the short response times users expect. Fog environments are less variable than cloud environments in terms of resource sharing, so the problem of low resource utilization must still be addressed. Surveys of existing research show that resource allocation can draw on neural networks, support vector machines and k-nearest neighbors (KNN); a minimal KNN-based placement sketch appears after this list.

 

- Energy efficiency: Idle fog nodes may be shut down if provisioning strategies and algorithms are improved, and unnecessary energy consumption can be avoided by jointly considering QoS and QoE, since application modules are placed on decentralized fog nodes. Energy consumption and cost are affected by memory, CPU and bandwidth usage and can be predicted by machine learning methods such as k-means, KNN, logistic regression, branch-and-bound, deep Q-networks (DQN) and SARSA.

 

- Security and privacy: The security of fog infrastructures is essential to application security, given issues such as information degradation, identity leakage, replay and denial-of-service attacks. Authentication, encryption and data integrity need to be implemented in dynamic computing environments because users lack control over their information.

- Fault Tolerance and Availability: One of the main reasons for developing fog computing is to improve reliability. Issues such as sensor failures, lack of access-network coverage in a specific area or across the network, service platform failures, and user interface connectivity outages are all part of the equation in fog computing. Another challenge in fog environments is improving application availability. One heuristic for improving service availability and QoS is to map applications to fog communities and then place their services temporarily on fog devices within those communities, treating it as a service placement problem.
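As referenced in the resource scheduling item above, here is a minimal sketch of KNN-based task placement. It assumes scikit-learn; the training log, the feature columns (CPU demand, input size in MB, deadline in ms) and the fog-vs-cloud labels are all hypothetical values chosen for illustration.

```python
# Minimal sketch: k-nearest-neighbours placement of an IoT task (fog vs. cloud).
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical log of past placements: [cpu_demand, input_mb, deadline_ms]
X_train = np.array([[0.2, 5, 50], [0.3, 10, 80], [0.8, 500, 2000],
                    [0.9, 800, 5000], [0.1, 2, 30], [0.7, 400, 1500]])
y_train = np.array([0, 0, 1, 1, 0, 1])          # 0 = fog node, 1 = offload to cloud

placer = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)

new_task = np.array([[0.25, 8, 60]])            # small, latency-sensitive task
print("fog" if placer.predict(new_task)[0] == 0 else "cloud")
```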

 

Image, video, natural language processing (NLP) and robotics applications of fog computing have only recently begun to emerge. Image processing in fog computing is one of the most widely used areas of artificial intelligence in research and industry, where the goal is to distinguish objects or people from each other and to categorize and identify photos using image processing algorithms. Using fog computing in image-processing-based applications reduces response time and improves quality of service. For medical applications that require accurate image processing and fast processing of medical data, it may be beneficial to use effective scheduling algorithms for deployment in fog environments. According to the literature, deep learning algorithms such as convolutional neural networks (CNNs) and generative adversarial networks (GANs) can be used for image processing in the fog.

 

Another area of interest in this field is NLP. Sound processing and recognition requires cloud and fog environments to store data, and deep learning methods that can detect imitated voices may be useful for security reasons. For example, in a smart home scenario, processing and recognizing conversations between a homeowner and an outsider should be done discreetly and quickly, so efficient scheduling methods for placing NLP applications in the fog are needed; deep learning technologies may be helpful in this regard. Industry, trade, agriculture and health can all benefit greatly from robotics, making robotics an important topic to discuss. In situations where judgments must be made quickly, it may be appropriate to use fog environments that employ machine learning for processing and decision making. For robots to communicate data, they need a fog environment that can respond quickly to their commands. The Internet of Things defines each robot as an object that can interact with other IoT things and other robots.

 

Therefore, scientists can utilize artificial intelligence to improve QoS and QoE.

 

 

Distributed computing has evolved from content delivery networks to a generally accepted and commonly used edge computing model that brings processing and data storage closer to the end user's location. Instant data created by and only for users requires edge computing and storage, but big data will always require cloud-based storage. As customers spend more and more time on mobile devices, organizations are recognizing that they need to move critical computing to the device in order to serve more customers.

 

The edge computing market is thus witnessing growth opportunities. By 2023, the market is expected to reach $1.12 trillion. According to Gartner, 74% of data will need to be processed at the edge in 2022, whereas 91% of data is currently processed in centralized systems.

 

Customers are increasingly concerned about their privacy and want to know how and where their data is acquired and maintained. Many companies serve customers through apps with AI-based customization features once the user has authenticated. Customers typically access AI-enabled gadgets using speakers, phones, tablets and bots. Due to the sensitive and personal nature of the data, multi-level and dynamic encryption processes are required. Edge nodes help build highly distributed architectures and help establish appropriate security policies for each device. Since services are dispersed at both the network and device levels, latency arises when sending data across networks and devices, and this latency must be managed when work is carried out across the network. Multiple load-balancing endpoints are required when applications must have end-to-end resiliency and a widely distributed architecture. Data computing services move closer to the mobile device or the edge (so-called "cloudlets"), which improves device-level elasticity.

 

This raises the question: how can we leverage AI to overcome these challenges?

 

Edge computing is a key enabler for AI, delivering high-quality performance at low cost; this is the best way to understand the connection between AI and edge computing, and we can benefit from combining the two. Because AI is data- and compute-intensive, edge computing helps AI applications overcome their technical bottlenecks. Artificial intelligence and machine learning systems need to assimilate large amounts of data in order to spot patterns and provide reliable recommendations. In AI use cases that require video analytics, streaming HD video to the cloud can lead to latency issues and increased costs because of the huge bandwidth consumed. Latency and reliance on centralized processing in the cloud are detrimental when AI triggers, decisions and actions must occur in real time. Processing and decision-making can instead take place at the data source, which means action can be taken at the edge, avoiding backhaul costs and making the edge an ideal location for data processing. Only the most important information and data sets are streamed to the cloud, while the rest of the data stays at the edge.

 

The decentralized and complex nature of edge computing networks poses a number of problems for infrastructure management. In order to manage resources effectively, a number of activities must be carried out, including workload estimation, task scheduling, virtual machine consolidation, resource optimization and energy conservation. In dynamic, fast-changing environments and real-time scenarios, traditional predefined rules (mainly based on operations research methods) have been used in the past for resource management. Artificial intelligence-based techniques are increasingly being used to solve these problems, especially when decisions have to be made online; in recent years, approaches based on AI, ML and DL have been widely adopted. To support user mobility, the cache must be able to predict where the user is going, and to reduce expense and energy consumption, cached content should be located on appropriate edge servers. Reinforcement learning, neural network modeling and genetic algorithms are some of the approaches; a minimal reinforcement learning sketch follows.
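The following is a minimal sketch of the reinforcement learning idea mentioned above: a tabular Q-learning agent that learns which of three candidate edge servers should serve cached content for each user zone. The latency matrix, the zones and all parameters are illustrative assumptions, not values from this article.

```python
# Minimal sketch: tabular Q-learning for edge cache/server selection.
import random
import numpy as np

n_zones, n_servers = 4, 3
latency = np.array([[10, 40, 80],    # round-trip latency (ms) from each user zone
                    [40, 10, 60],    # to each candidate edge server
                    [80, 60, 10],
                    [50, 20, 30]])

Q = np.zeros((n_zones, n_servers))
alpha, eps = 0.1, 0.1                # learning rate and exploration rate

for _ in range(5000):
    zone = random.randrange(n_zones)                       # a request arrives
    if random.random() < eps:
        action = random.randrange(n_servers)               # explore
    else:
        action = int(Q[zone].argmax())                     # exploit
    reward = -latency[zone, action]                        # lower latency = higher reward
    Q[zone, action] += alpha * (reward - Q[zone, action])  # one-step Q update

print(Q.argmax(axis=1))   # learned server choice per zone
```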

 

In business and industry, the benefits of edge technologies have spread rapidly, especially in reducing the growing expense of IT equipment, cloud usage and network bandwidth. A company's activities now take place all over the world. Despite the volume of data in the cloud and big data centers, only an estimated 1% of monitored data is relevant to business insights such as anomaly identification or future event prediction. The edge delivers high-quality business services through local processing, analytics and local devices. This is where the edge's operational efficiency and significance come into play: it avoids transmitting terabytes of unnecessary data to the cloud/data center and delivers only relevant and actionable data to the end user. Innovative applications of edge computing capabilities are proliferating. Edge computing still has problems reaching the "last mile" of decentralized networks, but new use cases in industrial applications are showing strong convergence with AI, especially for enterprise value. Augmented reality, virtual reality and mixed reality are becoming increasingly popular in the automotive, construction, process and manufacturing industries, which requires a computing infrastructure that is scalable, adaptable, responsive and always available. There are many uses for AI and machine learning in edge computing: smart retail, contact centers, security and legal assistants can all benefit from NLP capabilities such as parsing the human voice, recognizing handwriting and categorizing text.

 

Edge computing environments face significantly different issues compared to cloud, fog or serverless computing. Edge environments are plagued by scalability and performance-related issues, especially when dealing with mission-critical data and applications. Given the scale involved, it is extremely difficult to track the health and status of every IT component, especially when there are so many remote edge locations, let alone visualize and analyze their impact on other linked devices. The network edge is a highly decentralized and diverse setup. The problem is compounded by the fact that the infrastructure components vary in nature, and the cost of acquiring the various skills and resources required is high, creating "edge silos". Intelligent software based on artificial intelligence is essential to deal with such extremely decentralized and heterogeneous edge environments. With automated security and response, it is also possible to secure the entire customer environment without requiring the customer's involvement.

 

In addition, real-time performance management between endpoints such as consumers and cloud/data centers is a key issue. End-to-end views and data repositories across all sites are supported by tools that continuously monitor performance metrics and data volumes. In the event that an edge device fails or becomes unavailable, the edge infrastructure needs built-in redundancy to isolate, repair and maintain acceptable levels of operation. Uncertainty arises when an operational edge data center has multiple teams responsible for different parts of the infrastructure. In this case, advanced AI-based correlation and analytics are very useful to examine, integrate and unify data from multiple sources, transforming it into information and communicating it to the relevant roles in the team. The information provided can be used to help automate processes.

 

Private companies, including SpaceX and Amazon, are now building low Earth orbit (LEO) satellite constellations to provide global broadband internet access. As the number of users of LEO satellite access networks increases, it is critical to determine whether and how edge computing principles can be implemented in LEO satellite networks.

 

Modern computing systems use edge devices as an integral part of the data center, so specific IoT-based applications need to be created to provide more encrypted transmissions and protect data privacy. We can improve the capabilities related to edge computing in the following ways:

 

- Due to their resource constraints, IoT edge devices are unable to run the robust security software and firewalls built for desktop computers; blockchain technology, enhanced with AI/ML, can be used to strengthen their security. In addition, innovative software architectures, such as those that facilitate patching and maintenance of IoT devices, can be further enhanced by AI and ML.

 

- Automated decision-making based on AI/ML, rather than human-coded heuristics, offers a profitable path to optimize edge systems with massive amounts of data through engineering speed and efficiency.

 

- AI-based big data analytics approaches are needed to process edge device data in IoT applications at runtime.

 

Serverless computing is becoming increasingly popular when designing cloud-native applications.

 

Serverless computing is a cloud computing paradigm that abstracts away the operational aspects of infrastructure management. Since developers no longer need to worry about maintaining infrastructure, serverless computing is likely to scale at a rapid rate. Cloud service providers, in turn, can use serverless computing to manage infrastructure more easily and automate provisioning, reducing the time and resources required for infrastructure management.

 

Because AI is the future of technology, all platforms are embracing it. On the other hand, serverless design can solve most of the problems encountered by developers. With serverless architecture, machine learning models can be handled properly and resources can be managed efficiently. Thanks to this design, developers can devote more time and resources to the training of AI models rather than the management of server infrastructure.

 

Complex challenges often require the development of machine learning systems. These systems involve analyzing and preprocessing data, training models and tuning AI models, among other things, so the application programming interface should be able to run smoothly. Serverless computing and AI should be combined to ensure uninterrupted data storage and information delivery. Serverless architectures offer a variety of options and advantages from which machine learning models can benefit greatly: whatever the number of incoming requests, the infrastructure provider automatically allocates the required compute capacity.
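As an illustration of how a trained model might sit behind a serverless function, here is a minimal sketch following the common `handler(event, context)` convention used by several FaaS platforms. The model path, the event shape and the bundled artifact are assumptions for illustration, not a specific provider's API.

```python
# Minimal sketch: serving a pre-trained model from a serverless function.
import json
import pickle

_model = None   # cached across warm invocations of the same function instance


def _load_model():
    global _model
    if _model is None:
        # Hypothetical model artifact bundled with the function package.
        with open("/opt/model/model.pkl", "rb") as f:
            _model = pickle.load(f)
    return _model


def handler(event, context):
    # Expects a JSON body like {"features": [0.2, 1.4, 3.1]}.
    features = json.loads(event["body"])["features"]
    prediction = _load_model().predict([features])[0]
    return {"statusCode": 200, "body": json.dumps({"prediction": float(prediction)})}
```

Because the platform scales instances up and down on demand, the model is loaded lazily and reused across warm invocations, which keeps per-request latency low.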

 

The AI/ML integrated serverless architecture offers the following benefits:

 

- Fair Pricing: The serverless design enables execution-based pricing, so you only pay for the services you actually use. As a result, pricing models are more transparent and costs are significantly reduced.

 

- Work independently: Serverless computing enables development teams to run autonomously with little intervention and latency. As a result, the model is treated as a distinct function. Calling the function does not affect the rest of the system and can be done at any time.

 

- Auto-scaling: With the auto-scaling feature, there is no longer a need to pre-provision capacity based on load predictions, as developers can make changes at any time.

 

- Pay-per-use: With the "pay-per-use" model, customers only pay when they actually use resources. Instead of paying for a number of servers, serverless computing charges per service invocation. Combined with the scale-to-zero nature of serverless, users pay only for the number of executions and the time a resource is actually used.

 

- Hassle-free server management: Serverless computing provides a back-end service that is invoked only when needed, freeing users from the burden of managing servers. With a serverless back-end, service providers do not have to modify settings when they want to increase or decrease the amount of bandwidth they reserve or pay for. Before the advent of such offerings, it was both difficult and expensive for web developers to own and operate the hardware needed to run servers.

 

- High Availability: Serverless solutions are becoming increasingly popular because of their built-in availability and fault tolerance. As a result, there is no need to build services that can provide these features to applications because they are always available.

 

Quantum computing promises to be a technology that has the potential to fundamentally change artificial intelligence. With AI having a huge impact globally, the combination with quantum computing could have a multiplier effect, triggering a revolutionary impact on AI.

 

Quantum computing employs a novel approach to data and information processing: information encoded in the quantum state of a quantum system is processed according to the laws of quantum mechanics, opening up a number of opportunities not available with classical information processing. Examples include quantum superposition and quantum entanglement. Quantum entanglement is a property of composite quantum systems that limits the amount of partial information about the global quantum state available to an observer, making it impossible to fully describe the global state from knowledge of the individual subsystems alone.

 

The above characteristics of quantum systems underpin the power of quantum computation on the one hand (provided the interaction with the environment can be effectively shielded), but they are also the main factor preventing current computing systems, even AI-assisted supercomputers, from simulating quantum systems at scale. Indeed, the state space of a composite quantum system grows exponentially with the number of constituent subsystems.

 

The unit of information used in quantum computers is the quantum bit (qubit), which replaces the bit used in classical computers. A qubit can be physically realized by an atom, a photon, a superconducting circuit and so on, and its state Ψ is represented mathematically as a vector in a complex Hilbert space.
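In the standard notation, a single-qubit state is a normalized superposition of the two computational basis states:

$$|\Psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad \alpha, \beta \in \mathbb{C}, \qquad |\alpha|^2 + |\beta|^2 = 1,$$

so that a measurement yields 0 with probability $|\alpha|^2$ and 1 with probability $|\beta|^2$.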

 

The simulation of quantum systems was the initial impetus for the construction of quantum computers, but it was not until the discovery of quantum algorithms capable of achieving practical goals that more attention began to be paid to building such devices. Following the seminal work that formalized the concept of a quantum computer, several more algorithms have been devised to achieve tasks that are difficult for classical computers. The discovery of Shor's algorithm, which provides an efficient solution to the factorization of large numbers, has been of great importance to cryptanalysis and has spurred research in quantum computing and quantum cryptography. However, efficient operation of Shor's algorithm on working quantum hardware requires a level of precision in register initialization, quantum operations on multiple qubits, and quantum state storage that has not yet been achieved by current state-of-the-art devices.

 

It is worth noting that quantum computers have their own limitations. For example, it is expected that quantum computers cannot efficiently solve NP-hard optimization problems, and for unstructured search the speedup brought by quantum computers (Grover's algorithm) is only quadratic relative to the time required by classical computers.
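Concretely, for unstructured search over $N$ items the standard comparison is:

$$T_{\text{classical}} = O(N) \qquad \text{versus} \qquad T_{\text{Grover}} = O\!\left(\sqrt{N}\right).$$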

 

Indeed, building a quantum computer is no easy task: as experimentalists are well aware, the advantages of quantum computing brought about by properties such as quantum superposition and entanglement tend to disappear at an exponential rate as the size and complexity of the hardware (i.e. the number of quantum systems involved) increases. Nonetheless, in recent years, interest in quantum computing has grown among major high-tech companies (IBM, Microsoft, Google, Amazon, Intel, Honeywell), and many younger companies have come up with quantum computing solutions using a variety of core technologies, including superconducting devices, trapped ions, and integrated optical circuits. These are just some of the many companies that are currently funding quantum programs and are interested in developing this technology.

 

However, as mentioned earlier, maintaining the subtle properties of composite quantum states depends on the ability to isolate these devices from the external environment, so that coherent quantum evolution can survive even tiny amounts of noise. As a result, many of these devices require ultra-low temperatures of a fraction of a kelvin, which also poses the challenge of designing appropriate materials that operate well at such low temperatures.

 

While general-purpose quantum computers remain a long-term challenge, noisy intermediate-scale quantum (NISQ) devices are a foreseeable target in the coming years. Physicists can use such devices to begin to efficiently simulate complex composite quantum systems and to study exotic quantum states that have not yet been reached in physics laboratories.

 

The next step, once NISQ devices are reliable and fully developed, is to support the main computational unit with effective quantum error correction (QEC) circuits to overcome the limitations imposed by noise in the computational process. This may open the way to fault-tolerant quantum computing, which will need to involve thousands of qubits or more; indeed, QEC requires a large number of qubits and logic gates to implement. While research focuses on improving the performance of quantum devices and optimizing quantum computation, many entrepreneurs are also interested in producing quantum software solutions. It is therefore expected that many investors will back startups centered on quantum computing technology, and from this perspective interest in quantum computing is likely to increase.

 

Pharmaceutical investors' interest in quantum computing has already been ignited, and many industries can benefit from quantum computers and business solutions: finance, healthcare, genetics, pharmacology, transportation, sustainability and cybersecurity are all direct beneficiaries of quantum computing. The potential of quantum computing has been recognized by the banking industry. Financial analysts often use computational models that incorporate probabilities and assumptions about how markets and portfolios behave. Quantum computers can process data faster, run better forward-looking models and weigh competing options more accurately; they can also help solve complex optimization problems such as portfolio risk optimization and fraud detection.

 

According to a study published by IBM, quantum algorithms on the company's cloud platform outperform classical Monte Carlo simulations. Quantum solutions also hold great promise in healthcare: quantum computing can improve personalized medicine by enabling faster genomic analysis and personalized treatment strategies for each patient. Genetic research generates large amounts of data, so analyzing DNA information requires a great deal of processing power. Some companies are already reducing the cost and resources required to sequence the human genome; a powerful quantum computer may significantly speed up the screening of this data, making genome sequencing more affordable and scalable.

 

Another area where quantum computing may benefit drug development is protein folding. This may help speed up drug discovery efforts by making it easier to predict the effects of pharmacological compounds.

 

Public key cryptosystems are the foundation of today's communications. Rivest-Shamir-Adleman (RSA) encryption is the most commonly used cryptosystem for securing data transmission over networks, and its security rests on the fact that factoring large numbers is beyond the current capabilities of classical computing. However, as mentioned above, quantum computers running Shor's factorization algorithm may render this encryption model obsolete. As a result, research and investment in building cryptosystems that remain secure in the era of quantum computing has increased over the past few decades, and projections suggest that this interest will keep growing; Toshiba, for example, is targeting $3 billion in quantum cryptography revenues by 2030. Meanwhile, alongside efforts to design and implement effective quantum key distribution (QKD) protocols, the National Institute of Standards and Technology (NIST) has issued calls for post-quantum encryption standards: it has begun to solicit, evaluate and standardize one or more public-key encryption algorithms that can withstand attacks by quantum hardware.
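For a sense of scale (standard, approximate complexity estimates rather than results from this article), the best known classical factoring algorithm, the general number field sieve, is sub-exponential in the size of the number $N$, while Shor's algorithm is roughly polynomial in its bit-length:

$$T_{\text{GNFS}} \approx \exp\!\left(\left(\tfrac{64}{9}\right)^{1/3}(\ln N)^{1/3}(\ln\ln N)^{2/3}\right), \qquad T_{\text{Shor}} \approx O\!\left((\log N)^3\right).$$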

 

There have also been proposals to apply quantum computers to the environment, with the hope that quantum computing will open new avenues for combating the climate crisis by identifying and optimizing processes that can help combat global warming and other climate change impacts.

 

 

Quantum artificial intelligence illustration

 

Quantum computing is more efficient than classical computing at managing large amounts of data. Quantum algorithms are mathematical procedures executed on a realistic model of quantum computation; the quantum circuit model is the most commonly used. The state of a quantum computing system can be regarded as the information encoded in the physical quantum states that support a particular realization. Quantum information theory is built on these fundamental objects: qubits (units of information), quantum gates (devices that operate on qubits), and quantum channels that connect gates and circuits while maintaining quantum superposition and entanglement. Quantum computers can handle exponentially larger state spaces than conventional computers because they natively embody the tensor product structure of composite quantum systems.
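To make the tensor-product structure concrete, here is a minimal NumPy sketch (a classical simulation for illustration, not quantum hardware) that builds a two-qubit register, applies a Hadamard and a CNOT gate, and produces an entangled Bell state in a 2^2-dimensional state space:

```python
# Minimal sketch: simulating two qubits classically to show the tensor product structure.
import numpy as np

ket0 = np.array([1.0, 0.0], dtype=complex)                     # |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)    # Hadamard gate
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = np.kron(ket0, ket0)          # |00>, a vector in a 2**2 = 4 dimensional space
state = np.kron(H, I) @ state        # Hadamard on the first qubit
state = CNOT @ state                 # entangle: Bell state (|00> + |11>) / sqrt(2)

print(np.abs(state) ** 2)            # measurement probabilities: [0.5, 0, 0, 0.5]
```

Each additional qubit doubles the dimension of the state vector, which is exactly why classical simulation of composite quantum systems quickly becomes intractable.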

 

It is still very difficult to forecast the development of quantum advantage or to know exactly how and when we will see its full, deep applications; however, it is reasonable to say that the exponential acceleration promised by quantum computation will touch all problems that require managing large amounts of data, such as pattern recognition or the training of machine learning models.

 

The effectiveness of the training of machine learning models, like reinforcement learning, depends to a large extent on how fast the agents learn to interact with their environment: they interact, get some feedback, and then adapt (learn) their behavior based on the feedback received.

 

Biometrics and autonomous driving are two important examples of workloads that could be processed using quantum AI (QAI). The fact that quantum computers can process more data in less time than conventional computers underlies the concept of QAI, which combines quantum computing with artificial intelligence to achieve performance beyond classical AI. Reinforcement learning (RL) is a well-established branch of AI in which agents aim to maximize rewards through trial and error; it is reasonable to expect that combining reinforcement learning with quantum computing will lead to significant advances in computing systems. As quantum computers accelerate machine learning, the potential impact is bound to be enormous.

 

The ability of quantum computing to perform tasks quickly may help AI systems in applications such as autonomous driving and natural language processing (NLP), and in tasks that are extremely time-consuming and expensive with traditional methods. Achieving "meaning awareness" is one goal for quantum NLP algorithms: rather than treating words in isolation, such algorithms could work over whole phrases and paragraphs to model speech in real time. Notably, predictive analytics is a key application and business use case for AI. Massive amounts of data can be used to train AI systems based on machine learning and deep learning; however, complex and ambiguous problems such as stock market forecasting and climate change control systems may require generating richer representations of data using the quantum principles of entanglement and superposition.

 

New discoveries in AI algorithms for quantum computers or quantum artificial intelligence (QAI) are expected to lead to key breakthroughs needed to advance climate change science. Improvements in weather and climate prediction from this research are expected to have a cascading effect on a wide range of socio-economic advantages. For example, the National Aeronautics and Space Administration (NASA) has established the Quantum Artificial Intelligence Laboratory (QuAIL), which is dedicated to investigating the possibility of using quantum computers and algorithms to solve machine learning problems on NASA missions.

 

Thanks to quantum computing, nanotechnology and nanoscience may also be integrated into AI for ultra-small devices at the molecular, atomic and subatomic levels, where behavior is governed by quantum physics.

 

Machine learning applications for quantum devices are already in development, with the hope of using quantum computing to accelerate the training of machine learning models and to produce more principled learning algorithms. Even before full-scale quantum computing solutions are ready, machine learning and artificial intelligence are likely to benefit from improvements in quantum computing technology. As a result, the field of quantum machine learning (QML) is expected to grow, closely followed by adaptive quantum machine learning, which will be able to adaptively self-learn using quantum computing.

 

While quantum computing is still in its infancy, from a business and economic perspective now is the perfect time for startups to enter the field. The future of our economy will be determined not by cryptocurrencies but by quantum computing solutions.

2023-08-09