
Can Quantum-Inspired AI Compete With Today's Large Language Models?

Apr 20, 2025, 11:18 AM

Dynex, a company based in Liechtenstein, recently unveiled its Quantum Diffusion Large Language Model (qdLLM) as a finalist for an SXSW 2025 Innovation Award, a compelling development. The company claims qdLLM produces generative AI output faster and more efficiently than traditional Transformer-based systems running on today's technology infrastructure.

How does this compare to other emerging approaches? What does this mean for the broader future of AI?

The significance of quantum computing to AI

The core difference of quantum computing is its use of qubits, which can exist in multiple states simultaneously thanks to quantum superposition. This allows quantum computers to evaluate a large number of potential solutions in parallel, which may offer advantages in tasks such as large-scale optimization, simulation, or pattern recognition.
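To make the superposition idea concrete, here is a minimal, purely illustrative sketch in Python: it classically simulates a small qubit register and shows how a single state vector carries amplitudes for every basis state at once. This is a generic textbook construction, not Dynex's system or any vendor's API.

```python
import numpy as np

# Illustrative sketch only: a classical simulation of quantum superposition,
# not Dynex's actual system. A Hadamard gate puts each qubit into an equal
# superposition, so an n-qubit register holds amplitudes for all 2^n basis
# states at the same time.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

n_qubits = 3
state = np.zeros(2 ** n_qubits)
state[0] = 1.0  # start in the |000> basis state

# Apply H to every qubit: the full register operator is H ⊗ H ⊗ H.
full_op = H
for _ in range(n_qubits - 1):
    full_op = np.kron(full_op, H)
state = full_op @ state

print(state)               # 8 equal amplitudes of 1/sqrt(8)
print(np.abs(state) ** 2)  # uniform probability over all 2^3 basis states
```

Each added qubit doubles the size of the state vector, which is both the source of the parallelism and the reason classical simulation becomes expensive at scale.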

In the field of AI, researchers have explored how quantum features can improve tasks such as natural language processing, machine learning optimization, and model training efficiency. However, most of these efforts are still in their early stages. For example, IBM and MIT have studied how hybrid quantum-classical models can reduce training time for specific deep learning tasks, while startups such as Zapata AI are experimenting with quantum-enhanced models for sentiment analysis and prediction.

In this context, Dynex's approach introduces a new architecture that uses quantum-inspired algorithms to run LLMs more efficiently on decentralized hardware.

Dynex's qdLLM: A diffusion-based parallel approach

Unlike Transformer-based models that use autoregression to generate one token at a time, Dynex's qdLLM is built on a diffusion model that creates output tokens in parallel. According to Dynex, this approach is more computationally efficient and produces better contextual consistency.

“Traditional models like GPT-4 or DeepSeek work sequentially, word after word,” said Daniela Herrmann, Dynex co-founder and task leader at Dynex Moonshots. "qdLLM works in parallel. It thinks more like the human brain, processing all patterns at once. That's the power of quantum."
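The contrast between the two decoding styles can be sketched with a toy example. The snippet below is only an illustration of the general idea, with a random scorer standing in for a real model; it is not qdLLM or any published implementation. The autoregressive loop needs one model call per token, while the diffusion-style loop refines every position in parallel over a fixed number of steps.

```python
import numpy as np

# Toy contrast between autoregressive and diffusion-style decoding.
# The "model" here is a random scorer over a tiny vocabulary; it stands in
# for a real LLM and is not Dynex's qdLLM.
rng = np.random.default_rng(0)
VOCAB, LENGTH, STEPS = 50, 8, 4

def autoregressive_decode():
    # One token per step, each conceptually conditioned on the prefix so far.
    tokens = []
    for _ in range(LENGTH):
        logits = rng.normal(size=VOCAB)   # stand-in for model(prefix)
        tokens.append(int(np.argmax(logits)))
    return tokens                          # LENGTH sequential model calls

def diffusion_style_decode():
    # Start from noise and refine *all* positions in parallel at each step.
    tokens = rng.integers(0, VOCAB, size=LENGTH)
    for _ in range(STEPS):
        logits = rng.normal(size=(LENGTH, VOCAB))  # stand-in for model(tokens)
        tokens = logits.argmax(axis=1)             # update every position at once
    return tokens.tolist()                         # STEPS parallel model calls

print("autoregressive:", autoregressive_decode())
print("diffusion-style:", diffusion_style_decode())
```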

Several academic groups, including teams at Stanford University and Google DeepMind, along with major AI technology providers, have recently begun exploring diffusion-based Transformers.

Dynex further differentiates itself by integrating quantum annealing, a form of quantum optimization, to improve token selection during text generation. The company claims this increases consistency and reduces computational overhead compared to traditional LLMs.
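Dynex has not published details of this step, but the general idea of annealing-based selection can be illustrated with classical simulated annealing over a toy objective. Everything below, including the energy function, is a hypothetical stand-in chosen purely for illustration.

```python
import numpy as np

# Hedged sketch: classical simulated annealing standing in for the quantum
# annealing step described in the article. The energy function is a toy
# "consistency" score, not Dynex's objective.
rng = np.random.default_rng(1)
VOCAB, LENGTH = 20, 10

def energy(tokens):
    # Toy objective: penalize immediate repetition and large jumps in token id.
    repeats = np.sum(tokens[1:] == tokens[:-1])
    jumps = np.sum(np.abs(np.diff(tokens)) > VOCAB // 2)
    return float(repeats + jumps)

tokens = rng.integers(0, VOCAB, size=LENGTH)
temperature = 2.0
for step in range(2000):
    proposal = tokens.copy()
    proposal[rng.integers(LENGTH)] = rng.integers(VOCAB)  # mutate one position
    delta = energy(proposal) - energy(tokens)
    # Accept downhill moves always, uphill moves with Boltzmann probability.
    if delta <= 0 or rng.random() < np.exp(-delta / temperature):
        tokens = proposal
    temperature *= 0.999  # gradually cool

print("final tokens:", tokens.tolist(), "energy:", energy(tokens))
```

A quantum annealer explores the same kind of energy landscape but, in principle, escapes local minima through quantum fluctuations rather than thermally driven random moves.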

Decentralized and analog quantum hardware

One unique feature of the Dynex model is its reliance on a decentralized GPU network that simulates quantum behavior rather than requiring access to actual quantum hardware. This design allows the system to scale to what Dynex describes as up to one million algorithmic qubits.

"Any quantum algorithm, such as qdLLM, is being computed on the decentralized network of the GPU, which effectively simulate quantum computing," Herrmann explained.

This type of simulation has some similarities with TensorFlow Quantum (from Google and X), which also simulates quantum circuits on classical hardware to prototype algorithms. Similarly, many startups and vendors are developing platforms to simulate quantum logic at scale before physical hardware is ready.

In addition to software, Dynex plans to launch its own neuromorphic quantum chip, Apollo, in 2025. Unlike superconducting quantum chips that require cryogenic cooling, Apollo is designed to operate at room temperature and to support integration into edge devices.

"Using neuromorphic circuits allows Dynex to simulate quantum computing at scale, up to 1 million algorithmic qubits," Herrmann explained. “Dynex will start producing actual quantum chips that are also based on neuromorphic paradigms.”

Quantum's impact on AI efficiency and energy use

Dynex says qdLLM achieves 90% smaller model sizes, runs 10 times faster, and uses only 10% of the GPU resources typically required for equivalent tasks. These are significant claims, especially given growing concern about AI's energy consumption.

"The efficiency and parallelism of quantum algorithms reduce energy consumption because it is 10 times faster and requires only 10% of the number of GPUs," Herrmann said.

While independent verification is still needed, Dynex's approach echoes the efforts of Cerebras Systems, which has created wafer-scale chips that use less energy for training tasks. Another example is Graphcore, whose Intelligence Processing Unit (IPU) is designed to reduce the energy footprint of AI workloads through a dedicated parallel architecture.

Dynex reports that qdLLM performs strongly on benchmarks requiring strong reasoning, outperforming leading models including ChatGPT and Grok. Public benchmark data has not been released yet, but the company says it will publish a comparative study closer to its 2025 market launch. Until peer-reviewed benchmarks are available, Dynex's performance assertions remain anecdotal, if intriguing.

“We publish qdLLM benchmarks regularly and have proven that certain questions that require strong reasoning cannot be answered correctly by ChatGPT, Grok or DeepSeek,” Herrmann noted.

A bigger picture: How will quantum affect AI?

In the long run, Dynex believes that quantum computing will become the core of the AI field.

"We think quantum will dominate AI for the next five years," Herrmann said.

This prediction remains speculative, though not without precedent. Analysts at McKinsey, Boston Consulting Group, and Gartner have all noted that quantum computing could greatly improve optimization and simulation tasks, but that broad advantages may not materialize until after 2030 for most use cases. A more cautious view suggests that quantum-AI hybrids will first appear in niche applications such as drug discovery, financial risk modeling, or cybersecurity.

Currently, Dynex sits within a growing field experimenting with quantum-augmented or quantum-inspired AI methods. Whether its decentralized, diffusion-based qdLLM can live up to its benchmark claims remains to be seen, but its emergence suggests that the search for new foundations for AI is far from over.
