
Founded Year

2019

Stage

Series B | Alive

Total Raised

$154M

Last Raised

$110M | 1 yr ago


Mosaic Score
The Mosaic Score is an algorithm that measures the overall financial health and market potential of private companies.

+25 points in the past 30 days

About D-Matrix

d-Matrix develops data center artificial intelligence (AI) inference chips using in-memory computing (IMC) techniques with chiplet-level scale-out interconnects. It builds and deploys a mixed-signal DSP in a full-stack AI solution for a broad class of inference workloads in the cloud and infrastructure edge markets. The company was founded in 2019 and is based in Santa Clara, California.

Headquarters Location

5201 Great America Pkwy Suite 300

Santa Clara, California, 95054

United States


ESPs containing D-Matrix

The ESP matrix leverages data and analyst insight to identify and rank leading companies in a given technology landscape.

[ESP matrix chart: Execution Strength vs. Market Strength; quadrants: Leader, Highflier, Outperformer, Challenger]
Enterprise Tech / Data Management

The compute-in-memory (CIM) market (also called processor-in-memory) combines data storage and computational processing within the same memory unit. Instead of the traditional model, where data is transferred back and forth between separate processing and storage units, CIM performs computations where the data is stored. This can significantly reduce data movement, increasing processing speed and energy efficiency.
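The data-movement saving behind CIM can be sketched with a toy cost model. This is a hypothetical illustration of the general idea, not a model of d-Matrix's or any vendor's hardware; the transfer counts are simplified assumptions.

```python
# Toy cost model: conventional load/compute pipeline vs. compute-in-memory (CIM).
# "Transfers" counts how many values cross the memory bus (hypothetical metric).

def von_neumann_dot(a, b):
    """Conventional model: every operand is fetched from memory to the ALU."""
    transfers = 0
    acc = 0
    for x, y in zip(a, b):
        transfers += 2          # fetch x and y across the memory bus
        acc += x * y            # multiply-accumulate in the processor
    transfers += 1              # write the final result back to memory
    return acc, transfers

def cim_dot(a, b):
    """CIM model: multiply-accumulate happens inside the memory array,
    so only the final scalar ever leaves it."""
    acc = sum(x * y for x, y in zip(a, b))  # computed "in place"
    transfers = 1               # read out the single result
    return acc, transfers

a = [1, 2, 3, 4]
b = [5, 6, 7, 8]
print(von_neumann_dot(a, b))  # (70, 9)
print(cim_dot(a, b))          # (70, 1)
```

Under this simplified model, the conventional pipeline's bus traffic grows linearly with vector length while CIM's stays constant, which is why CIM is attractive for the large matrix-vector products that dominate AI inference.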

D-Matrix was named a Highflier among 15 other companies, including IBM, SK hynix, and Microchip Technology.

D-Matrix's Products & Differentiators

    d-Matrix Corsair

    d-Matrix Corsair is an AI inference accelerator platform that runs generative AI inference significantly faster, with better performance per total cost of ownership (Perf-TCO), compared to alternative solutions.


Research containing D-Matrix

Get data-driven expert analysis from the CB Insights Intelligence Unit.

CB Insights Intelligence Analysts have mentioned D-Matrix in 2 CB Insights research briefs, most recently on Sep 13, 2024.

Expert Collections containing D-Matrix

Expert Collections are analyst-curated lists that highlight the companies you need to know in the most important technology spaces.

D-Matrix is included in 3 Expert Collections, including Artificial Intelligence.

A

Artificial Intelligence

14,767 items

Companies developing artificial intelligence solutions, including cross-industry applications, industry-specific products, and AI infrastructure solutions.

S

Semiconductors, Chips, and Advanced Electronics

7,204 items

Companies in the semiconductors & HPC space, including integrated device manufacturers (IDMs), fabless firms, semiconductor production equipment manufacturers, electronic design automation (EDA), advanced semiconductor material companies, and more.

G

Generative AI

863 items

Companies working on generative AI applications and infrastructure.

D-Matrix Patents

D-Matrix has filed 8 patents.

The 3 most popular patent topics include:

  • artificial neural networks
  • instruction processing
  • parallel computing

Application Date: 10/17/2022

Grant Date: 1/30/2024

Related Topics: Natural language processing, Instruction processing, Computational linguistics, Artificial neural networks, Parallel computing

Status: Grant

Latest D-Matrix News

The Rise of Non-NVIDIA GPUs

Aug 28, 2024

AI chip startups are focused on delivering top-tier products and are unafraid to compete directly with NVIDIA.

NVIDIA may reign as the king of GPUs, but competition is heating up. In recent years, a wave of startups has emerged, taking on the Jensen Huang-led giant at its own game. Tenstorrent, a startup led by Jim Keller, the lead architect of AMD's K8 microarchitecture, is developing AI chips that the company claims perform better than NVIDIA's GPUs. "We have very power-efficient compute, where we can put 32 engines in a box the same size as the one NVIDIA puts eight in. With our higher compute density and similar power envelope, we outperform NVIDIA by multiples in terms of performance, output per watt, and output per dollar," Keith Witek, chief operating officer at Tenstorrent, told AIM. (Wormhole by Tenstorrent)

NVIDIA's data centre chips require silicon interposers and HBM memory chips; companies like Samsung and SK hynix, along with NVIDIA, have made millions selling these parts. Tenstorrent's chips, however, eliminate the need for them.

Similarly, Cerebras Systems, founded by Andrew Feldman in 2015, has developed chips to run generative AI workloads such as model training and inference. Its chip, the WSE-3, is the world's largest AI chip, with over 4 trillion transistors and 46,225 mm² of silicon. The startup claims its chips are 8x faster than the NVIDIA DGX H100 and are designed specifically to train large models. (World's largest AI chip: WSE-3)

Startups Building for the Inference Market

Some startups are developing chips designed specifically for inferencing. While NVIDIA's GPUs are in great demand because they are instrumental in training AI models, they might not be the best tool available for inference. D-Matrix, a startup founded by Sid Sheth, is developing silicon that works best at inferencing tasks.
Its flagship product, Corsair, is specifically designed for inferencing generative AI models (100 billion parameters or fewer) and is much more cost-effective compared to GPUs. "We believe that a majority of enterprises and individuals interested in inference will prefer to work with models up to 100 billion parameters. Deploying larger models becomes prohibitively expensive, making it less practical for most applications," he told AIM.

Another startup locking horns with NVIDIA in this space is Groq, founded by Jonathan Ross in 2016. According to Ross, his product is 10 times faster, 10 times cheaper, and consumes 10 times less power. Groq is designed to provide high performance for inference tasks, which are critical for deploying AI models in production environments.

Recently, another player, Cerebras, announced Cerebras Inference, which it claims is the fastest AI inference solution in the world. It delivers 1,800 tokens/sec for Llama 3.1 8B and 450 tokens/sec for Llama 3.1 70B, which it says is 20x faster than NVIDIA GPU-based hyperscale clouds.

Challengers in the Edge AI Market

While NVIDIA may have made its name and money by selling GPUs, over the years it has also expanded into other segments, such as developing chips for humanoids, drones, and IoT devices. SiMa.ai, a US-based startup with strong roots in India, is building chips that can run generative AI models on the embedded edge. Founded by Krishna Rangasayee in 2018, the startup regards NVIDIA as its biggest competitor. Rangasayee believes multimodal AI is the future, and the startup's second-generation chip is designed to run generative AI models on the edge: on cars, robotic arms, humanoids, and drones. "Multimodal is going to be everywhere, from every device to appliances, be it a robot or an AI PC. You will be able to converse, watch videos, parse inputs, just like you talk to a human being," he told AIM.
Notably, SiMa.ai's first chip, designed to run computer vision models on the edge, beat NVIDIA on the MLPerf benchmarks. Another NVIDIA competitor in this space is Hailo AI, which is building chips that run generative AI models on the edge.

Everyone Wants a Piece of the Pie

Notably, these startups are not seeking a niche within the semiconductor ecosystem. Instead, they are focused on delivering top-tier products and are unafraid to compete directly with NVIDIA. They all want a piece of the pie and are already locking horns with NVIDIA. D-Matrix, for instance, counts Microsoft, one of the major AI model builders, as a customer. Sheth revealed that the company has customers in North America, Asia, and the Middle East and has signed a multi-million-dollar contract with one of them. The point here is that Microsoft is one of NVIDIA's biggest enterprise customers. Cerebras also counts some of the top research and supercomputing labs as customers. Riding on this success, the startup plans to go public this year. Rangasayee previously told AIM that his startup is in talks with many robotics companies, startups building humanoids, public sector organizations, and some of the top automobile companies in the world.

They All Might Lose to CUDA

All these startups have made substantial progress, and some are preparing to launch their products in the near future. While having advanced hardware is crucial, the real challenge for these companies will be competing against a monster: CUDA. These startups, which position themselves as software companies that build their own hardware, have come up with their own software to make their hardware compatible with their customers' applications. For example, Tenstorrent's open-source software stack, Metalium, is similar to CUDA but less cumbersome and more user-friendly. With Metalium, users can write algorithms and program models directly to the hardware, bypassing layers of abstraction.
Interestingly, Tenstorrent has another stack called BUDA, which represents the envisioned future utopia, according to Witek. "Eventually, as compilers become more sophisticated and AI hardware stabilises, reaching a point where they can compile code with 90% efficiency, the need for hand-packing code in the AI domain diminishes." Nonetheless, it remains to be seen how these startups compete with CUDA. Intel and AMD have been trying for years, yet CUDA remains NVIDIA's moat. "All the maths libraries... and everything is encrypted. In fact, NVIDIA is moving its platform more and more proprietary every quarter. It's not letting AMD and Intel look at that platform and copy it," Witek said.

D-Matrix Frequently Asked Questions (FAQ)

  • When was D-Matrix founded?

    D-Matrix was founded in 2019.

  • Where is D-Matrix's headquarters?

    D-Matrix's headquarters is located at 5201 Great America Pkwy, Santa Clara.

  • What is D-Matrix's latest funding round?

    D-Matrix's latest funding round is Series B.

  • How much did D-Matrix raise?

    D-Matrix raised a total of $154M.

  • Who are the investors of D-Matrix?

    Investors of D-Matrix include M12, Playground Global, Microsoft, Temasek, Marvell and 7 more.

  • Who are D-Matrix's competitors?

    Competitors of D-Matrix include Groq and 4 more.

  • What products does D-Matrix offer?

    D-Matrix's products include d-Matrix Corsair.


Compare D-Matrix to Competitors

Mythic Logo
Mythic

Mythic is a high-performance analog computing company specializing in AI acceleration technology. Their products include the M1076 Analog Matrix Processor and M.2 key cards, which provide power-efficient AI inference for edge devices and servers. Mythic primarily serves sectors that require real-time analytics and high data throughput, such as smarter cities and spaces, drones and aerospace, and AR/VR applications. Mythic was formerly known as Isocline Engineering. It was founded in 2012 and is based in Austin, Texas.

Cerebras Logo
Cerebras

Cerebras focuses on artificial intelligence (AI) work in computer science and deep learning. The company offers a new class of computers, the CS-2, which is designed to train AI models efficiently, with applications in natural language processing (NLP), computer vision, and computing. Cerebras primarily serves sectors such as health and pharma, energy, government, scientific computing, financial services, and web and social media. It was founded in 2016 and is based in Sunnyvale, California.

Axelera AI Logo
Axelera AI

Axelera AI specializes in AI hardware acceleration for generative AI and computer vision inference within the technology sector. The company offers AI acceleration hardware and software solutions that facilitate advanced computer vision and AI capabilities at the edge. Its products are designed to enhance the performance of AI applications, providing a cost-effective and user-friendly platform for innovation in AI. It was founded in 2021 and is based in Eindhoven, Netherlands.

Tenstorrent Logo
Tenstorrent

Tenstorrent focuses on developing hardware for deep learning within the AI and machine learning industries. It offers a range of products, including accelerated compute IP solutions, AI computer cards, customizable software, and dense AI/ML data center systems. Its products are designed to cater to the needs of the data center and edge computing sectors. It was founded in 2016 and is based in Toronto, Canada.

S
SEMRON

Semron specializes in the development of chips within the semiconductor industry, focusing on intelligence density. The company's main offerings include AI chips that perform computations within a smaller physical footprint, utilizing its proprietary CapRAM technology for energy-efficient operation. It was founded in 2020 and is based in Dresden, Germany.

B
BITMAIN

BITMAIN manufactures hardware for the digital currency mining sector, specializing in mining servers. The company offers power-efficient technology and provides computational infrastructure solutions to the global blockchain network. It primarily serves the cryptocurrency mining industry. It was founded in 2013 and is based in Beijing, China.

