XD NODE presents AI innovation with ultra-small NPU servers instead of high-cost GPUs

Posted February 27, 2026, 13:22

- XD NODE, led by CEO Su-jong Cho, is disrupting the AI hardware market by developing the "XDN-E100 Mini Server," an ultra-compact and low-power alternative to expensive, high-heat GPU servers.
- The mini server is equipped with the domestic MOBILINT MLA100 NPU, offering server-class inference performance (80 TOPS) at a fraction of the cost—approximately 3 million won compared to 10 million won for traditional setups.
- Designed for office-desk use with minimal noise and power consumption, XD NODE aims to launch mass production in the second half of 2026 to provide startups and SMEs with affordable, on-premise AI infrastructure.


Su-jong Cho, CEO of XD NODE / source=IT dongA


As Artificial Intelligence (AI) services permeate all industries, the importance of the high-performance hardware infrastructure that powers them grows by the day. However, existing Graphics Processing Unit (GPU) servers pose a significant economic burden, with implementation costs reaching tens of millions of won, while also generating immense noise and heat and requiring extensive installation space. Because of this, many startups and small enterprises are turned away at the threshold of adopting AI infrastructure.

The startup 'XD NODE' is a deep-tech company that focused on precisely this point. Having previously concentrated on distributing high-performance GPU servers and building infrastructure, it is now preparing to challenge the market by directly designing an ultra-compact AI server equipped with a domestic Neural Processing Unit (NPU) specialized for AI 'inference.' We met with Su-jong Cho, CEO of XD NODE, who aims to bring a new perspective and fresh vitality to the AI hardware market, to hear about the infrastructure innovation the company envisions.

- I am curious about your career before founding the company and the motivation behind starting XD NODE
: Around 2017, I joined a partner company distributing NVIDIA products as a salesperson. At that time, before the AI boom, NVIDIA was mainly known as a graphics card company, and the enterprise GPU server sector received relatively less attention, with almost no sales. In that situation, I witnessed the entire process of the AI GPU server market being born and developing explosively on-site. Believing that AI would become the core of the Fourth Industrial Revolution, I staked everything and founded the company in 2023. Currently, XD NODE focuses on the distribution, infrastructure construction, maintenance, and technical support of NVIDIA GPUs and GPU-based servers. Recently, we have also started product development.

- What problems did you discover while experiencing the business?
: Small companies like startups faced physical and environmental constraints. Even if they wanted to adopt existing high-performance GPU servers, they lacked both the space for a separate server room and the professional personnel to manage it. Prices are also high. Furthermore, the immense noise and heat characteristic of existing GPU servers make them nearly impossible to use in a general office. I witnessed these circumstances countless times while working in sales on-site. Ultimately, many customers compromised with workstations, but most were not fully satisfied in terms of performance or quietness. A new alternative was needed to solve their concerns.

- What solution have you prepared to overcome those limitations?
: That is why low-power NPU technology has been receiving attention recently. We are developing the 'XDN-E100 Mini Server' by applying this technology. It is equipped with the 'MLA100', a 25W-class NPU from the domestic AI semiconductor company MOBILINT. The core of this product is its small size and low power consumption. Not only is the price lower than that of existing GPU servers, but the noise, heat, and power consumption are also dramatically reduced to the level of a regular PC. The body is only about the size of two Korean dictionaries, so it can be used directly on an office desk.

- What are the application fields for NPU-based ultra-compact servers? How do they differ from existing GPU servers?
: This product is optimized for AI 'inference'—running already trained models to serve actual services—rather than AI 'training', which processes vast amounts of data. It is highly suitable for areas requiring real-time image inference, such as autonomous driving, object recognition in games, and vision AI using CCTV. Not every company needs the maximum performance of a tens-of-millions-of-won GPU server, let alone its massive heat and power consumption. Our mini server is scheduled for release at a price in the 3-million-won range; within the inference domain, it offers utility equivalent to existing 10-million-won GPU servers, eliminating unnecessary performance and cost waste.

Cho introducing the MOBILINT MLA100 NPU to be installed in the XDN-E100 Mini Server / source=IT dongA


- What were the customer reactions when you introduced the product under development? I am also curious about the specific mass production schedule.
: When I introduced the product concept to customers during the planning stage, the majority showed deep interest. In particular, immediate reactions such as "I want to buy it as soon as it comes out" reached 30–40%, confirming that the market demand is clear. Currently, we are producing prototypes through collaboration with Advantech, a Taiwanese industrial PC specialist, and expect prototype delivery in early March. After going through a meticulous testing and optimization process, if there are no major issues, we plan to enter full-scale mass production and sales starting from the second half of this year.

- You are utilizing the SeoulTech Initial Startup Package program while pursuing your technical startup. How useful has it been?
: As a startup, we received tremendous help. First, the support funds allowed us to secure excellent human resources in a timely manner. Previously, I had only thought of covering business funds through loans, but through the program I realized the importance of Investor Relations (IR) activities, which enabled more active and aggressive business development. Additionally, thanks to the numerous seminar and conference opportunities provided by Seoul National University of Science and Technology (SeoulTech), we were able to broadly expand our network of related organizations and partners. I strongly recommend that early-stage companies dreaming of a leap forward challenge themselves with such support programs.

- What is the final goal and vision that XD NODE aims to achieve in the market in the future?
: The existing server distribution market tends to run on inertia. While AI technology and products grow explosively every day, existing market participants have often failed to keep up with the pace of this change. The AI hardware market has a completely different character from the existing server market. I believe companies like XD NODE, which look at the market from a new perspective, must take an active role in breathing vitality into it.

Simply holding sales channels and competing on price can never satisfy the heightened expectations of customers, nor sustain a business for long. For the many innovative companies looking to develop business in earnest in the AI era, XD NODE will be the most reliable and capable technical partner.

By Young-woo Kim (pengo@itdonga.com)