AGI Technology, a leading provider of storage solutions, welcomes you to CES 2025. From January 7 to January 10, join us at The Venetian Resort Las Vegas, Level 2 Meeting Rooms, Bellini 2001B, to discover a variety of products tailored to meet the diverse needs of creators and professionals.

Event Details
Date: January 7–10, 2025
Time: 9:00 a.m. – 6:00 p.m.
Location: The Venetian Resort Las Vegas, Level 2 Meeting Rooms
Room: Bellini 2001B

Highlighting AGI's Advanced Storage Solutions

Supreme Pro TF138 2TB microSD Card
The Supreme Pro TF138 is the world's first 2TB microSD card, built to handle demanding applications. Ideal for professional videographers, gamers, and mobile users, it supports extended video recording, seamless file management, and expanded storage capacity. Paired with its Type-C card reader, the TF138 delivers swift data transfers of up to 170/160 MB/s, ensuring reliable performance across various uses.

CF Express Type B Card
AGI's CF Express Type B Card offers a balance of speed and capacity for high-resolution workflows. Available in 256GB and 512GB models, it delivers sequential read speeds of up to 1,700 MB/s and write speeds of up to 1,000 MB/s, making it an excellent choice for 4K and 8K video production. This card offers dependable storage for professionals who require efficiency and precision in their work.

EDM38 Magnetic Portable SSD
The EDM38 Magnetic Portable SSD redefines portability with its practical magnetic attachment, enabling easy and secure storage on the go. With transfer speeds of up to 2,050/1,800 MB/s and available in 1TB or 2TB capacities, it supports ProRes 4K video recording. Compatible with Windows, macOS, and Android, it provides a versatile solution for busy professionals.

AGI Technology will also be showcasing a range of new accessories at CES 2025, designed to complement its core storage lineup.
Attendees are encouraged to visit Bellini 2001B to explore these solutions and connect with AGI's expert team.

Dynamic Solutions with AGI Innovations at CES 2025.
MicroEJ, a global leader in embedded software solutions, is introducing VEE Wear 2, the newest version of its Wearable OS, designed to set new standards in battery efficiency, customization, and health tracking for wearables. Building on a year of market insights and customer feedback, the second generation of VEE Wear offers up to 3x the battery life of competing solutions, enabling manufacturers to develop differentiated, feature-rich smartwatches for all market segments—from entry-level to premium models—without relying on resource-heavy operating systems.

Leveraging partnerships with Polar, LifeQ, B-Secur, Facer, as well as NXP Semiconductors, Qualcomm, and others, VEE Wear 2 introduces new health and fitness tracking capabilities, audio enhancements, a complete development framework, and a scalable app ecosystem with hundreds of thousands of watch faces. Together, these elements turn market innovation into consumer-focused products at a much faster pace.

"VEE Wear 2 is built for watchmakers and tens of thousands of UX and app designers, seamlessly bridging the real and digital worlds of wearables," said Dr. Fred Rivard, CEO of MicroEJ. "By disrupting what's possible in the smartwatch market—particularly with extended battery life, optimized size, Android-compatible tools, and support from technology partners—we've removed many technical barriers for manufacturers. This allows them to focus on branding, innovation, and delivering standout products that meet the aspirations of their end users."

Unparalleled Battery Life
Battery performance is essential as consumers expect more from their smartwatches without needing frequent recharging, while also wanting sleek designs that can't accommodate large batteries.
By running on low-power and cost-effective microcontrollers, and through intelligent task distribution and adaptive power management, VEE Wear 2 enables mid-range RTOS smartwatches to achieve up to 40 days of battery life, and high-end Android-based models to deliver up to 3 days on a single charge, even with advanced features like GPS and health monitoring.

A New Era of Health Monitoring, Fitness Tracking, and AR in Wearables
In partnership with B-Secur, whose FDA-cleared ECG technology powers advanced cardiac insights, and LifeQ, VEE Wear integrates medical-grade health monitoring, from ECG analysis and sleep tracking to insights based on over 150 physiological biomarkers. The incorporation of Polar's suite of 25 proprietary algorithms, backed by 40+ years of scientific research, further elevates fitness-tracking capabilities, enabling athlete-level monitoring and coaching. Additionally, the integration of ActiveLook's AR technology brings real-time, hands-free data directly into the user's line of sight during sports and fitness activities, offering a transformative experience without compromising battery performance.

Pioneering Customization and Brand Identity
With this new release, VEE Wear stands out in the market with its comprehensive customization options, from UX/UI design to sensor integration. With customizable APIs, manufacturers can integrate advanced features such as audio, health metrics, GPS, and notifications with ease. Tight integration with Facer provides over 500,000 unique watch faces in VEE Wear, allowing brands to offer consumers extensive personalization options.
Built-in simulation tools and a complete development framework accelerate the design process, empowering manufacturers to bring high-impact products to market faster, without the constraints of closed ecosystems.

Enabling Continuous Innovation with a Scalable App Ecosystem
With its scalable application ecosystem, VEE Wear enables brands to create fully branded smartwatches, establish their own app stores, and expand revenue opportunities. This adaptable framework allows for continuous innovation, supporting seamless updates and new features to enhance the user experience over time.

Explore VEE Wear 2 and its revolutionary advancements at CES 2025 from January 7-10 at the Venetian Expo, Booth #52823. For more information, download the product brief or visit https://www.microej.com/product/veewear.

Disrupting Smartwatch OS Offers Advanced Health Tracking, Endless Customization Capabilities, and Ultra-Low Power Performance
Taipei Blockchain Week (TBW) 2024 was nothing short of incredible! The event not only brought together blockchain enthusiasts from across the globe but also highlighted the energy and enthusiasm of the Taiwanese Web3 community. For Tevau, this wasn't just an industry event — it was a significant milestone in our journey to connect with one of the most vibrant and passionate crypto communities in Asia.

A Thriving Booth with Nonstop Excitement
From the very first day, Tevau's booth became a hub of activity. The excitement was palpable as attendees crowded around, eager to learn about Tevau's vision for the future of payments. Conversations flowed effortlessly as we showcased how Tevau is reshaping financial tools to make transactions seamless and accessible for everyone. The overwhelming interest we received underscored the Taiwanese community's openness to innovation and its eagerness to embrace cutting-edge solutions.

Inspiring Panel: Revolutionizing Payments
One of the highlights of our participation was Rosina, Tevau's Strategic Director for East Asia, speaking on a panel about Revolutionizing Payments. The discussion delved into how Web3 technologies, particularly stablecoins, can create a bridge for Web2 users to enter the decentralized world with confidence and trust. Rosina's insights on the role of digital payment solutions in building an inclusive financial ecosystem resonated strongly with the audience.

Bringing Tevau Closer to the Taiwanese Community
TBW 2024 was also a unique opportunity for us to connect on a deeper level with the local community. From blockchain professionals to crypto-curious attendees, we had the privilege of exchanging ideas, hearing their feedback, and exploring how Tevau could contribute to Taiwan's thriving ecosystem.

We believe in the power of partnerships, and Taiwan is a key focus for Tevau as we continue to expand.
This event not only introduced our platform to new audiences but also reinforced our commitment to nurturing strong ties with local communities.

Looking Forward: Collaboration Opportunities with Taiwan
As we look ahead, we're doubling down on our focus on Taiwan. We're excited to work closely with Taiwanese KOLs, media outlets, and community leaders to further our mission. If you share our vision of empowering financial freedom through innovation, we warmly invite you to collaborate with us. Let's create something extraordinary together!
With the rapid growth of AI, high-performance computing (HPC), and cloud computing, data centers face increasing challenges regarding performance demands, deployment flexibility, and energy efficiency. To address these issues, NVIDIA has introduced the MGX modular server architecture, which offers unparalleled flexibility and scalability, providing a revolutionary solution for modern data centers and redefining the future of computing.

The MGX modular server architecture, developed by NVIDIA, is designed to accelerate AI and HPC advancements. Centered on GPUs, MGX adopts standardized hardware specifications and flexible configurations. It supports NVIDIA Grace CPUs, x86 architecture, and the latest GPUs, such as the H200 and B200, while integrating advanced networking capabilities with BlueField DPUs. MGX is compatible with 19-inch standard racks, supporting 1U, 2U, and 4U servers, and is further enhanced with compatibility for OCP standards. Beyond supporting multiple generations of GPUs, CPUs, and DPUs, MGX's modular pool includes I/O modules, PCIe cards, add-in cards, and storage modules, enabling over 160 system configurations to meet diverse data center needs, shorten development cycles, and reduce costs.

The modular and highly flexible design of the MGX architecture helps enterprises deploy HPC solutions swiftly, offering diverse scenario-based solutions for cloud service providers (CSPs) and enterprise data centers. First, in AI and generative AI, MGX excels in deep learning, leveraging modular design and multi-GPU parallel computing to accelerate AI model training. It supports applications such as speech recognition, image processing, and large language models (LLMs). For high-performance computing, MGX's multi-GPU parallel processing capabilities empower scientific research, deep learning training, and autonomous driving applications, serving as a significant breakthrough in the computing revolution.
Lastly, MGX's modular design enables CSPs to deploy cloud servers rapidly in cloud and edge computing applications. Supporting multi-generation CPUs, GPUs, and DPUs, it delivers comprehensive solutions for cloud services. Moreover, MGX's high performance and compact design for edge computing meet low-latency demands, such as real-time image analysis and decision-making in smart cities and autonomous driving applications.

As an NVIDIA partner, Chenbro is dedicated to promoting the adoption and expansion of the MGX architecture by offering versatile server chassis solutions. Chenbro collaborates with system integrators and server brands to develop customized solutions, ranging from open chassis to JDM/ODM and OEM-plus services. Whether for standardized deployments or fully customized server designs, Chenbro ensures each customer receives solutions tailored to market demands, supporting diverse deployments in AI, HPC, and big data.

1U/2U Compute Trays Supporting GB200 NVL72/NVL36 Liquid-Cooled Racks (Single Rack Version)
Chenbro provides 1U and 2U Compute Trays compatible with GB200 NVL72 and NVL36 server racks, a high-density server solution designed by NVIDIA. The 1U configuration supports two compute boards per tray, each with two Blackwell GPUs and one Grace CPU, collectively known as the GB200 Superchip. The MGX standard rack houses 18 compute trays, offering 36 Grace CPUs and 72 Blackwell GPUs. For the 2U configuration, nine compute trays per rack combine 18 Grace CPUs and 36 Blackwell GPUs.

The GB200 NVL72 and NVL36 utilize a "Blind Mate Liquid Cooling Manifold Design" for efficient cooling, ensuring stable operation under prolonged high workloads. Additionally, NVLink technology achieves data transfer speeds of up to 1,800 GB/s, significantly enhancing data processing efficiency.
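The rack totals quoted for the compute trays follow from simple multiplication. A quick sketch confirms them, using only the per-board figures stated in this article (one Grace CPU and two Blackwell GPUs per compute board, two boards per tray):

```python
# Per-board resources of a GB200 Superchip compute board, as quoted above.
CPUS_PER_BOARD = 1   # one Grace CPU
GPUS_PER_BOARD = 2   # two Blackwell GPUs
BOARDS_PER_TRAY = 2  # two compute boards per tray

def rack_totals(trays_per_rack: int) -> tuple:
    """Return (total Grace CPUs, total Blackwell GPUs) for a rack."""
    boards = trays_per_rack * BOARDS_PER_TRAY
    return (boards * CPUS_PER_BOARD, boards * GPUS_PER_BOARD)

# NVL72 rack: 18 x 1U trays; NVL36 rack: 9 x 2U trays.
print(rack_totals(18))  # (36, 72): 36 Grace CPUs, 72 Blackwell GPUs
print(rack_totals(9))   # (18, 36): 18 Grace CPUs, 36 Blackwell GPUs
```

Both results match the CPU and GPU counts stated for the NVL72 and NVL36 configurations.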
This makes it ideal for AI training, cloud computing, and large-scale data processing scenarios, providing robust support for computationally intensive applications such as speech recognition, natural language processing (NLP), and AI inference. With its modular design and exceptional performance density, this rack solution helps enterprises establish the next generation of AI factories.

2U MGX Chassis: Flexible Configuration and Future-Proof Compatibility for Enterprise-Grade Server Chassis Solutions
Focused on AI server applications, Chenbro's 2U enterprise-grade server chassis solutions are built on the MGX system. Their modular design and future-proof expandability support GPU, DPU, and CPU upgrades, allowing subsystem reuse across various applications and generations. Designed to fit standard EIA 19-inch racks, the 2U MGX chassis supports traditional PCIe GPU cards, with configurations accommodating up to four GPU cards and air-cooled solutions. With modular bays of varying sizes, the chassis enables users to customize accelerated computing servers to meet specific application needs.

With MGX's flexibility and scalability, Chenbro collaborates with system integrators and server brands to develop customized solutions for AI training, large dataset processing, and other high-performance applications. These tailored AI server solutions meet diverse cross-industry AI requirements.

4U MGX Air-Cooled and Liquid-Cooled Chassis Solutions to Meet Future Enterprise Data Center Needs
Developed in collaboration with NVIDIA, Chenbro's 4U MGX air-cooled chassis solution is designed for AI training and HPC applications. It supports up to eight double-width GPGPU or NVIDIA H200 GPUs, and features five front-mounted and five mid-mounted 80x80 fan brackets for efficient cooling. The air-cooled 4U MGX is compatible with standard EIA 19-inch racks.

The 4U MGX liquid-cooled chassis, on the other hand, supports up to 16 liquid-cooled single-slot GPUs.
Utilizing liquid cooling manifold technology, it distributes coolant efficiently to GPU assemblies, motherboards, and switchboards, ensuring superior thermal management and energy efficiency for high-density, long-duration operations. The liquid-cooled 4U MGX chassis must operate within MGX standard racks. Its design suits enterprise scenarios requiring scientific computation, big data processing, AI training, and HPC.

Unlike the high-density GPU solutions offered by the NVL72 and NVL36 racks, the 4U MGX adopts a traditional Intel and AMD x86 CPU architecture, targeting enterprise users rather than large CSP scenarios. Both air-cooled and liquid-cooled 4U MGX solutions provide system integrators with greater flexibility to design proprietary MGX-compliant motherboards. Chenbro collaborates with clients to offer tailored server chassis solutions, meeting enterprise demands with flexible and efficient server solutions.

NVIDIA MGX Partner: Chenbro Driving the Evolution of AI and Data Centers
The NVIDIA MGX modular server architecture, with its exceptional flexibility, performance, and broad applicability, has become a pivotal milestone in the evolution of data center technology. As a partner, Chenbro actively engages in chassis design and production to promote the adoption and realization of this architecture, offering faster and more flexible server solutions. From GB200 NVL72/NVL36-compatible 1U/2U Compute Trays to 2U and 4U MGX chassis solutions, Chenbro's standard products and customized solutions not only meet current market demands but also lay a solid foundation for future AI server applications.

Looking ahead, the MGX architecture will continue to lead technological advancements in data center technology. With the rapid development of AI, 5G, and edge computing technologies, its application range will expand further, driving diverse data center solutions.
Chenbro is building ecosystems around these architectures and will continue collaborating with NVIDIA and global clients to introduce next-generation AI server products. For more information about MGX products, please get in touch with Chenbro's sales representatives or visit the official website.

Chenbro Launches NVIDIA MGX Server Chassis Solutions for Empowering AI and Data Center Development
Global Unichip Corp. (GUC), a leading provider of cutting-edge ASIC (Application-Specific Integrated Circuit) design services, today announced it is joining the Arm Total Design ecosystem. This collaboration highlights GUC's commitment to deliver comprehensive and innovative design solutions, enabling customers to accelerate the development of advanced semiconductor innovations.

As part of the Arm Total Design ecosystem, GUC will gain preferential access to the cutting-edge Arm Neoverse CSS compute platforms that underpin purpose-built AI SoC solutions for cloud data centers, HPC, and edge. Combining this with GUC's rich expertise in chiplet and 3DIC technology enables GUC to deliver comprehensive and differentiated services in next-generation system integration, pushing the boundaries of ASIC and chiplet design, and offering innovative solutions optimized for high-performance computing applications.

"Our ultimate goal is to enable powerful but cost-efficient Arm Neoverse CSS-powered processors using TSMC's 3D SoIC-X technology, with CPU cores implemented in the most advanced process nodes while keeping SLC cache, CMN, and UCIe at mainstream process, and PCIe and DDR on separate chiplets," said Igor Elkanovich, CTO of GUC. "GUC will contribute to the joint effort with its silicon-proven, very low latency 3D interface GLink-3D and our silicon-correlated 3D flows: 3Dblox, physical implementation, power distribution, thermal, and mechanical."

"Joining the Arm Total Design ecosystem represents a key step in GUC's strategy to enhance our custom silicon capabilities," said Aditya Raina, CMO of GUC. "By leveraging the Arm Neoverse CSS and TSMC's 3DFabric technology, we are well-positioned to offer customers groundbreaking solutions that incorporate advanced chiplet design and 3DIC capabilities.
We are excited about the possibilities this collaboration will unlock for next-generation SoC designs."

"The Arm Total Design ecosystem is fostering collaboration and providing the flexibility needed to create new cutting-edge silicon to take on intensive AI-powered workloads," said Eddie Ramirez, vice president of go-to-market, Infrastructure Line of Business, Arm. "GUC's innovative ASIC and 3DIC solutions will help the ecosystem harness the efficiency of the Neoverse CSS, reduce time-to-market, and inspire a new generation of Arm-based chips to power data centers sustainably."

This partnership will enable GUC to provide customers with preferential access to Arm Neoverse CSS compute platforms, ensuring rapid deployment of advanced ASICs, chiplets, and 3DIC solutions for a wide range of applications, including data centers, edge computing, and high-performance computing.

For more information, please visit our website: http://www.guc-asic.com
MicroEJ, a leader in embedded software, unveils VEE Energy — a solution that transforms standard meters into agile, AI-enabled smart devices, revolutionizing how utilities manage grid infrastructure without costly hardware replacement. With VEE Energy, metering companies gain the flexibility to deploy intelligence at the edge, bringing complexity-free innovation to the smart grid with the same transformative effect that apps brought to smartphones. This software-defined approach allows companies to break free from hardware constraints through app upgrades, transforming the pace and accessibility of energy innovation.

MicroEJ is a trusted partner for leaders in the energy sector, driving software-defined advancements on tens of millions of devices. Industry giants like Schneider Electric and Landis+Gyr rely on VEE Energy to enhance grid reliability and unlock new possibilities for application development on smart endpoints, including meters and network interface cards—all on cost-effective hardware.

Unlocking AMI 2.0 with Smarter Meters and Intelligent Endpoints
As utilities transition to AMI 2.0, they face growing challenges, including increased electricity demand, renewable integration, undersized grids, and rising energy costs. According to the 2024 Itron Resourcefulness Report, 80% of utilities invest in AI to improve grid monitoring and anomaly detection, yet 50% lack the expertise to implement it effectively. VEE Energy is the version of MICROEJ VEE built by and for energy-sector actors; it bridges this gap, offering a cost-effective and scalable path to AI-driven innovation on existing hardware.

VEE Energy leverages cloud technologies adapted to the edge to create an evolving, flexible application platform tailored to energy management needs.
It allows utilities and third parties to easily deploy edge AI applications on their meters, without costly hardware upgrades, transforming endpoints into dynamic, intelligent devices.

"VEE Energy empowers utilities to lead the next era of energy management," says Dr. Fred Rivard, CEO of MicroEJ. "The energy sector is at a turning point where edge intelligence complements cloud analytics to overcome today's challenges. With VEE Energy, meters, network interface cards, and gateways are evolving from simple endpoints to essential components of AMI 2.0, helping utilities meet current and future challenging demands."

Key benefits of VEE Energy include:
* Enhanced grid management for distributed energy resources like solar and EVs
* Flexible app deployment without hardware disruption
* Granular, real-time data insights for enhanced safety and improved consumer engagement
* Enhanced security with memory-safe software, aligned with CISA's guidance

Partnerships with Industry Leaders
MicroEJ is trusted by top industry players to drive faster innovation. Since 2022, the company has notably enabled Landis+Gyr's Revelo® meter, alongside other new-generation meters on the US market, to deliver advanced edge intelligence and sensing capabilities, helping utilities and consumers optimize energy usage. This collaboration enhances grid reliability and accelerates the development of new applications.

As an example of another type of collaboration, Schneider Electric uses MicroEJ's solution to integrate software-defined architectures into its EcoStruxure platform, advancing energy efficiency and sustainability.

MicroEJ also works with technology innovators in the energy sector to bring comprehensive AMI 2.0 solutions to market; the first example is Kalkitech, a leader in AMI 2.0 communication, which supports VEE Energy.

Explore VEE Energy and its revolutionary advancements at CES 2025 from January 7-10 at the Venetian Expo, Booth #52823.
For more information, download the product brief or visit https://www.microej.com/product/vee-energy.

MicroEJ safe app ecosystem enables utilities to deploy edge intelligence with no need for costly hardware upgrades
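The grid-monitoring and anomaly-detection workload described above can be pictured with a minimal, entirely hypothetical sketch. VEE Energy's actual APIs are not shown in this article, so the detector below is a generic rolling z-score over interval readings, the kind of lightweight logic that fits on a meter-class device; the window size and threshold are illustrative choices, not product figures:

```python
from collections import deque
from math import sqrt

class IntervalAnomalyDetector:
    """Flag interval readings that deviate sharply from the recent baseline.

    A generic sketch of meter-side edge analytics; nothing here is taken
    from any MicroEJ product API.
    """

    def __init__(self, window: int = 48, threshold: float = 3.0):
        self.readings = deque(maxlen=window)  # rolling history
        self.threshold = threshold            # z-score cutoff

    def observe(self, kwh: float) -> bool:
        """Return True if this reading is anomalous vs. the rolling window."""
        anomalous = False
        if len(self.readings) >= 8:  # need some history before judging
            n = len(self.readings)
            mean = sum(self.readings) / n
            var = sum((x - mean) ** 2 for x in self.readings) / n
            std = sqrt(var)
            if std > 0 and abs(kwh - mean) / std > self.threshold:
                anomalous = True
        self.readings.append(kwh)
        return anomalous

detector = IntervalAnomalyDetector()
normal = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.1, 1.0, 0.98]
flags = [detector.observe(x) for x in normal]  # all False: steady usage
spike = detector.observe(9.0)                  # True: 9 kWh interval stands out
```

Running such a detector at the endpoint, rather than streaming every interval to the cloud, is the sort of edge-complements-cloud division of labor the release describes.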
Alpha Networks, committed to pushing the boundaries of AI-powered content discovery, presents TUCANO to supercharge content aggregation with a passion to shape the future of video.

With online video streaming accounting for 80% of global internet traffic (according to Cisco's Visual Networking Index), there is unprecedented demand for high-quality, personalized, and engaging content. AI is at the forefront of this transformation, providing the tools to deliver better user experiences, optimize content delivery, and create content.

AI-Powered TUCANO, Revolutionizing Content Discovery and User Experience
Alpha Networks believes the future lies in recommending the right content and delivering it in the most user-friendly way, tailored to individual preferences. AI isn't just a tool for automation; it's a bridge between content, context, and user intent.

Key features of Tucano that push the envelope of video include:
* Enhanced Metadata: Tucano's AI dynamically analyzes video content, generating semantic tags, timestamps, and rich synopses—ensuring data accuracy and enriching user searches.
* Highlight Extraction: Tucano identifies key moments within programs, making it easier for users to discover the most compelling parts of content.
* Dynamic Editorial Tools: Editors are empowered with AI-driven insights to curate segment-specific homepages or navigation flows, creating personalized user interfaces with minimal effort.
* Adaptive Navigation: Tucano tailors app structures to user preferences, ensuring TV fans and SVOD enthusiasts alike enjoy optimized interfaces based on their consumption habits.

These continuous improvements reinforce Tucano's role as a core platform in modern video ecosystems, supported by a solid roadmap for future features and enhancements.

TUCANO also enables the monetization of your digital outreach through video advertising, premium content and subscription-based models, sponsored content and brand partnerships, live streaming, and
personalized video ads, without significantly increasing overhead.

Guillaume Devezeaux, CEO of Alpha Networks, said: "With a team of 150 experts, Alpha Networks is committed to pushing the boundaries of AI-powered content discovery. By improving data quality and empowering editorial teams, we create platforms that adapt to how users consume content—whether ad-supported or premium experiences. Our innovations enable platforms to bridge gaps between fragmented ecosystems, unlocking seamless, intuitive access to video content."

See you at CES 2025, The Venetian Expo, Booth #50752
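Highlight extraction of the kind advertised above can be sketched generically. How Tucano actually scores moments is not disclosed in this article, so the sketch below simply assumes a per-second engagement score is available (from audio energy, chat activity, an ML model, or similar) and picks the best non-overlapping clips:

```python
def top_highlights(scores, clip_len=30, k=3):
    """Pick the k highest-scoring, non-overlapping clips of clip_len seconds.

    `scores` holds one engagement score per second of a program; how such
    scores are produced is out of scope here and not taken from Tucano.
    Returns (start_second, end_second) pairs in program order.
    """
    # Average score of every clip_len-second window.
    windows = [
        (sum(scores[i:i + clip_len]) / clip_len, i)
        for i in range(0, max(1, len(scores) - clip_len + 1))
    ]
    windows.sort(reverse=True)  # best windows first

    chosen = []
    for avg, start in windows:
        # Greedily keep windows that don't overlap an already-chosen clip.
        if all(abs(start - s) >= clip_len for _, s in chosen):
            chosen.append((avg, start))
        if len(chosen) == k:
            break
    return sorted((s, s + clip_len) for _, s in chosen)

# Example: a 5-minute program with two engagement spikes.
scores = [0.0] * 300
scores[60:90] = [10.0] * 30
scores[200:230] = [5.0] * 30
print(top_highlights(scores, clip_len=30, k=2))  # [(60, 90), (200, 230)]
```

The greedy non-overlap rule is the simplest reasonable policy; a production system would also respect scene boundaries and minimum spacing between clips.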
Snowdrop Solutions recently announced its collaboration with BigPay to transform banking experiences for users in Thailand. This strategic partnership focuses on improving financial management by integrating advanced technology into the BigPay platform, allowing Thai users to manage their money more efficiently. As digital banking advances, this collaboration showcases the importance of secure and user-friendly tools that improve how customers interact with their finances.

Although the partnership addresses the challenges of traditional financial systems, there are alternative payment solutions that also contribute to better money management. One of these is the use of cryptocurrencies in online transactions. For example, crypto casino platforms demonstrate how digital currencies can simplify payments while maintaining security and privacy. These systems allow users to access modern payment methods without the complexities associated with traditional banking, making crypto a viable option for individuals looking for flexibility in managing their funds.

By using crypto as a payment method, these platforms are also able to provide a host of benefits. These include crypto-centric perks that are a direct result of crypto's underlying blockchain technology, such as instant withdrawals and anonymous play.

By integrating Snowdrop Solutions' API technology, BigPay users in Thailand can benefit from enriched transaction data that delivers clear insights into their spending habits. The transaction enrichment API, known as MRS API, was designed to help users understand their financial activities through accurate and detailed information. This capability simplifies money management by offering personalized insights and improving the overall user experience.
Users can now track their transactions with clarity, reducing the confusion often associated with generic banking records. Moreover, the enriched transaction data includes precise merchant names and logos, giving users a detailed view of their spending patterns. This transparency aligns with the growing demand among Thai consumers for better financial tools that combine convenience and accuracy. By addressing these needs, Snowdrop Solutions and BigPay are creating a foundation for a more intuitive banking experience that resonates with the shift toward cashless transactions in Thailand.

Another noteworthy advantage of this partnership is its alignment with the Thai government's push toward digital payment adoption. As mobile banking and e-wallet usage continue to rise in the country, collaborations like this ensure that consumers have access to seamless and secure financial solutions. With BigPay leveraging Snowdrop's technology, users can manage their finances more confidently, knowing they have access to tools that promote informed decision-making and responsible spending.

The integration also helps resolve common customer pain points. By providing detailed, easily comprehensible transaction data, the platform simplifies financial decision-making for users. This not only improves their understanding of personal finances but also empowers them to take charge of their spending and budgeting. Additionally, personalized tools and a user-friendly interface make financial management accessible to everyone, regardless of their level of financial literacy.

This partnership also highlights the role of technology in advancing financial inclusion. By simplifying complex banking functions and making them more user-friendly, the collaboration ensures that more people can benefit from digital banking services.
The ability to track spending accurately and access personalized insights also opens up opportunities for better financial planning and management.

In addition to addressing immediate financial challenges, this partnership sets the stage for long-term benefits for Thai users. As technology continues to evolve, partnerships like this are likely to pave the way for further advancements in the banking sector, making digital financial management more accessible and intuitive for all. Whether through enriched transaction data, digital payments, or alternative systems like cryptocurrency, these innovations are shaping a future where consumers have greater control, security, and transparency in managing their money.

As consumers increasingly seek out tools that simplify their financial lives, innovations like enriched transaction data and user-friendly platforms will become essential. This collaboration demonstrates how technology can bridge the gap between traditional banking and the digital future, offering users greater control and clarity over their financial activities.
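Transaction enrichment of the kind described in this article can be pictured with a small, purely illustrative sketch. The MRS API's real request and response formats are not given here, so every field name, the directory lookup, and the fallback behavior below are assumptions; real enrichment services use far richer matching (location data, ML classifiers):

```python
# Purely illustrative: map raw card-statement descriptors to cleaned
# merchant details. All descriptors, fields, and URLs are invented.
MERCHANT_DIRECTORY = {
    "GRAB*TRANSPORT BKK": {
        "merchant": "Grab",
        "category": "Transport",
        "logo_url": "https://example.com/logos/grab.png",  # placeholder
    },
    "7-ELEVEN 12345 TH": {
        "merchant": "7-Eleven",
        "category": "Convenience Store",
        "logo_url": "https://example.com/logos/7-eleven.png",  # placeholder
    },
}

def enrich_transaction(raw: dict) -> dict:
    """Attach merchant name, category, and logo to a raw transaction."""
    details = MERCHANT_DIRECTORY.get(raw["descriptor"])
    enriched = dict(raw)  # keep the original fields intact
    if details:
        enriched.update(details)
    else:
        # Fall back to the raw descriptor so nothing is lost.
        enriched["merchant"] = raw["descriptor"]
        enriched["category"] = "Uncategorized"
    return enriched

tx = {"descriptor": "GRAB*TRANSPORT BKK", "amount_thb": 185.00}
print(enrich_transaction(tx)["merchant"])  # Grab
```

The point of the sketch is the user-facing difference: a cryptic statement line becomes a recognizable merchant with a category, which is what makes spending patterns legible.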
swisstech announced today its participation at CES 2025 in Las Vegas, the world's most influential tech event, held January 7-10, 2025. With the theme "Step into the future: Experience Swiss AI innovation," swisstech will present two dedicated pavilions featuring Switzerland's leadership in artificial intelligence, innovation, and its thriving business ecosystem.

As part of CES Unveiled Amsterdam, Switzerland Global Enterprise will introduce Switzerland's world-class advancements in AI, highlighting solutions that accelerate digital transformation for businesses across a wide range of industries. The pavilions at CES Las Vegas will emphasize Switzerland as a hub for cutting-edge technologies, from AI-powered automation and robotics to advanced medtech, fintech, edutech, and cybersecurity solutions. Switzerland recently debuted one of the fastest computers in the world, the new ethical supercomputer Alps, which will strongly contribute to research infrastructure.

Key highlights include:
Two Pavilions: Located at the Venetian Expo, Eureka Park, Level 1, Hall G, Booth 61033 and Global Pavilion, Level 2, Hall D, Booth 50435. 32 Swiss startups, leading companies, and research institutions contributing to Switzerland's reputation as a global hub for AI and technology will be featured.
Collaboration Opportunities: Attendees will have the chance to engage with Swiss innovators, experience solutions live, and explore partnership opportunities that leverage AI to solve real-world business challenges.

Why Switzerland?
Switzerland consistently ranks among the top nations for innovation, boasting a favorable regulatory environment, world-class universities, and a robust tech ecosystem. It is home to breakthrough developments in AI, driven by collaboration between the public and private sectors.
In the field of academia, EPFL and ETH Zurich are stepping up their efforts to lead Swiss AI research to the international forefront, enhancing their collaborations and announcing the foundation of the Swiss National AI Institute (SNAI) with the goal of addressing challenges in AI. With CES 2025's global reach, Switzerland aims to strengthen its position as a destination for businesses seeking AI solutions that are both impactful and sustainable.

Switzerland's presence at CES 2025 underscores its commitment to supporting the growth of AI and digital transformation on an international scale. The event will be an opportunity for businesses to discover how Switzerland's AI ecosystem can drive the future of business globally.

Eureka Park at the Venetian (booth #61033): 24 companies
8inks, Absolute Magnetics, Acktao, Aerospec, Beekee, Calopad, Clever Forever Education, csky.ai, Enerdrape, E-Outdoor, GraphEnergyTech, Identic AI, Lifehive, Magnes, mimic robotics, Neural Concept, Nutrix, Perovskia Solar, Rimon Technologies, Senbiosys, swip, Swistor, Wearin', Sonixapp

Global Pavilion at the Venetian (booth #50435): 9 companies
Algorized, Avatronics, Ecorobotix, GreenGT, GLOBAL ID, Hemargroup, Hypergate, Sensoryx

For more information on Switzerland Global Enterprise at CES 2025, see: https://www.s-ge.com/en/event/swiss-pavilion/swisstech-pavilion-ces-2025?ct.

Credit: swisstech
The high-performance computing (HPC) market continues to gain strong momentum, with research finding that the global market is expected to grow massively for applications requiring heavy data computation and ever-deeper analysis. These advanced applications include high-frequency trading, autonomous vehicles, genomics-based precision medicine, computer-aided design and simulation, deep learning, and more. HPC refers to using powerful computing systems to quickly process massive influxes of data and solve complex problems, a field largely fueled by advances in artificial intelligence (AI) technology. Organizations across industries are embracing AI to drive innovation and unlock new revenue streams. This AI imperative demands computing infrastructures that can process data-intensive workloads at unprecedented speed and scale. However, this shift also brings significant challenges. Combined with rapid advancements in data center evolution and the promise of emerging technologies like quantum AI, it is clear that compute infrastructure is driving the need for expanded data center investment.

Supermicro is a leader in the design and manufacture of high-performance servers and storage solutions based on modular, open architectures. Many of Supermicro's servers are used for complex IT requirements and for performance-critical computing environments whose power-hungry components require liquid cooling. The company recently announced the launch of new H14-generation servers, GPU-accelerated systems, and storage servers featuring AMD EPYC 9005 Series processors and AMD Instinct MI325X GPUs. Through Supermicro's tight relationship with its CPU and GPU suppliers, AMD and Supermicro have long partnered on solutions that serve key customers in the digital economy.
This partnership was again on display at the SuperComputing 2024 conference, where Supermicro showcased its latest high-compute-density, multi-node solutions optimized for intensive HPC workloads.

Supermicro solutions powered by AMD

Supermicro's new H14 family uses the latest 5th Gen AMD EPYC processors, which power the most demanding enterprise and HPC workloads in the industry and enable up to 192 cores per CPU with up to 500W TDP (thermal design power). The company has designed new H14 servers, including the Hyper and FlexTwin systems, that can accommodate these higher thermal requirements. The H14 lineup also includes three systems for AI training and inference workloads supporting up to 10 GPUs, which feature the AMD EPYC 9005 Series CPU as the host processor, and two that support the AMD Instinct MI325X GPU.

Supermicro and AMD are collaborating to seize the opportunity to establish themselves as leaders in AI-driven data infrastructure. Charles Liang, president and CEO of Supermicro, said: "Supermicro's H14 servers have 2.44X faster SPECrate2017_fp_base performance using the EPYC 9005 64-core CPU as compared with Supermicro's H11 systems using the second generation EPYC 7002 Series CPUs. This significant performance improvement allows customers to make their data centers more power efficient by reducing the total data center footprint by at least two-thirds while also adding new AI processing capabilities."
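The footprint claim above follows from simple consolidation arithmetic: if each new server delivers roughly 2.44x the SPECrate2017_fp_base throughput of the old one, proportionally fewer servers match the old fleet's aggregate throughput. A minimal sketch of that calculation (the 2.44x figure is from the quote; the fleet size and helper function are purely illustrative, and the speedup alone yields just under a two-thirds reduction, with the quoted figure presumably also counting density gains):

```python
import math

# Rough consolidation arithmetic behind the quoted footprint reduction.
# Assumes aggregate throughput scales linearly with server count
# (an idealization; real fleets also gain from density improvements).

def servers_needed(old_servers: int, speedup: float) -> int:
    """Servers required to match the old fleet's aggregate throughput."""
    return math.ceil(old_servers / speedup)

old_fleet = 100  # hypothetical H11 server count, for illustration
new_fleet = servers_needed(old_fleet, 2.44)  # 2.44x SPECrate2017_fp_base

print(new_fleet)                           # 41
print(f"{1 - new_fleet / old_fleet:.0%}")  # 59%
```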
The new H14 Supermicro product line, based on 5th Gen AMD EPYC CPUs, supports a broad spectrum of workloads and excels at helping a business achieve its goals, with two standout claims: the highest-performance x86 server processor and leadership x86 energy efficiency.

H14 server family fulfills multiple needs of modern data centers

The rapid growth of consumer AI adoption has driven many large technology companies to accelerate their shift toward large language models and other AI technologies to provide innovative solutions and remain competitive in both the public and private sectors. Different types of workloads are used to accomplish different AI tasks, and all of them are addressed by the Supermicro H14 servers and storage systems. These include:

High-Performance Computing (HPC) – HPC systems are no longer used only by university and national-lab researchers. Today, more enterprises integrate HPC systems into everyday workflows to bring products to market faster and to discover new vaccines and drugs. Advanced HPC systems require fast cores, large amounts of memory, and fast networking between systems.

Cloud – Designing and implementing a cloud solution requires a wide range of products optimized for different workloads, not just for environments where the price/performance of the compute aspect is most important. Storage and networking are critical for a productive and cost-effective cloud data center.

Artificial Intelligence (AI) – Systems with fast CPUs and associated GPU subsystems are required for the growing range of AI use cases. Supermicro H14 servers can house up to 10 GPUs in a 5U rack height and excel at AI applications, enabling faster training and inference. Supermicro designs servers specifically to accommodate a high number of GPUs for maximum AI application performance.
In addition, the Supermicro GPU servers incorporate the latest GPUs from several vendors in various form factors.

Big-Data Analysis – As the volume of data generated everywhere explodes, systems must access, analyze, and present structured and unstructured data to the user. These tasks require the ability to hold an increasing amount of data in memory, fast computation, and quick data communication to GPUs when needed.

Virtualization – With many enterprises using virtualization technologies to get higher utilization from existing servers, the new Supermicro H14 servers with 5th Gen AMD EPYC processors allow for more powerful virtual machines, as more cores and faster CPUs are available.

Enterprise – Typical enterprise workloads will benefit from the new Supermicro H14 systems through increased performance and reduced costs. In addition, existing workloads will execute faster while using less power than on previous generations of Supermicro servers.

The spotlight features of Supermicro's H14 portfolio of servers

Managing AI workloads in data centers can be difficult if the systems are not ready to meet the need. Networking, processing, and scalability features must be in place for AI workloads to function. Supermicro's H14 servers bring several unique strengths to these requirements.

Broad selection – The Supermicro H14 product line offers a wide range of choices optimized for specific workloads. It consists of the following product families: the Hyper enterprise server, the CloudDC versatile system optimized for cloud data centers, the GrandTwin 4-node compute platform, the FlexTwin 2U 4-node high-density performance compute system, and 4U/5U/8U GPU systems.
The wide range of options satisfies the expansion of data centers and the need for advanced infrastructure.

Compute Power – The Supermicro H14 products with AMD EPYC 9005 Series processors offer top-level performance on many metrics and, combined with high core counts, are ideal for a range of workloads. The solutions are purpose-built to accelerate data center, cloud, and AI workloads, driving new levels of enterprise computing performance.

Max Core Counts – Supermicro H14 servers have been designed to house AMD's most powerful and energy-intensive CPUs for high-end computing environments, with up to 192 cores in a single CPU, making the H14 servers ideal rackmount solutions for cloud, HPC, and AI applications. Using a 48U rack as an example, FlexTwin can support up to 96 dual-processor nodes and 36,864 cores within this rack size.

Max Density – Supermicro H14 multi-node architectures leverage shared resources, including cooling and power supplies, to maximize energy efficiency, with compact node form factors that allow significantly higher compute and component densities than standard rackmounts. Taking the all-new H14 FlexTwin as an example, this design is purpose-built for HPC at scale, with front-accessible nodes, flexible networking and storage, and direct-to-chip liquid cooling, providing outstanding density and optimized thermal performance.

Thermal Design – By optimizing the airflow within a system, high-performing CPUs can be used without concern for overheating. Liquid cooling increases compute density and lowers the data center's power usage effectiveness (PUE).

Delivering high-performance, liquid-cooled servers to unleash the full potential of AI

In general, high-end server systems are also high-density systems: they deliver higher performance and usually better efficiency, but that also means higher power density.
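The 36,864-core figure above is straightforward arithmetic, and a quick sanity check confirms it (the rack layout numbers are from the text; the calculation itself is just illustrative):

```python
# Sanity-check the FlexTwin core-count claim for a 48U rack.
nodes_per_rack = 96   # dual-processor nodes (from the text)
cpus_per_node = 2     # dual-socket nodes
cores_per_cpu = 192   # 5th Gen AMD EPYC top core count

total_cores = nodes_per_rack * cpus_per_node * cores_per_cpu
print(total_cores)  # 36864, matching the quoted figure
```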
The GPU-dense systems designed for AI workloads have driven power demands from 6-12 kilowatts per rack to 40-60 kilowatts, quickly ramping to 100-150 kilowatts per rack. Not surprisingly, moving toward 500 kW or even 1 MW per rack is an ongoing trend. While airflow optimization and containment remain excellent methods of improving efficiency and density for now, current server solutions are quickly reaching the physical limits of air cooling, and the next logical step is to turn to liquid cooling.

Leveraging direct-to-chip liquid cooling technology, Supermicro removes 90% of server-generated heat in FlexTwin systems. This capability has become a strategic advantage for Supermicro as it works to capture AI server business and maintain a competitive edge. The new H14 server family demonstrates significant technological advancements and showcases the company's strength in liquid cooling. In addition, Supermicro works closely with customers to architect and design rack-level and entire data center solutions for HPC workloads. After the design is validated with close customer involvement, Supermicro offers on-site deployment services, reducing time-to-deployment. With a global manufacturing footprint and production facilities, Supermicro can produce a total of 5,000 racks per month, including 2,000 liquid-cooled racks, with lead times of weeks, not months.

These efforts, together with close partnerships with AI chipmakers, allow Supermicro to gain early access to new chips and bring its server families to production before competitors. This has been an important advantage and has attracted global hyperscalers, driving demand for its AI infrastructure. The pace of adoption of advanced AI use cases will certainly continue to evolve.
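To put the direct-to-chip figure in context, a back-of-the-envelope calculation shows how much heat liquid cooling captures at the rack densities mentioned above (the rack power values and the 90% capture fraction are from the text; the framing that rack power roughly equals heat output is a standard simplification, not a Supermicro specification):

```python
# Back-of-the-envelope: heat captured by direct-to-chip liquid cooling
# at the rack densities mentioned in the article. Nearly all electrical
# power a rack draws is eventually dissipated as heat.

liquid_capture = 0.90  # fraction of server heat removed by liquid (quoted)

for rack_kw in (12, 60, 150):
    liquid_kw = rack_kw * liquid_capture
    air_kw = rack_kw - liquid_kw
    print(f"{rack_kw:>4} kW rack -> {liquid_kw:5.1f} kW to liquid, "
          f"{air_kw:4.1f} kW left for air handling")
```

At 150 kW per rack, only about 15 kW remains for conventional air handling, which illustrates why air cooling alone stops scaling at these densities.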
Future server designs will reflect the mix of different types of chips deployed and their associated power consumption, as well as the balance between cloud and edge computing for AI workloads and the typical compute, storage, and networking needs of those workloads. Supermicro has carved out its own niche by selling high-performance, liquid-cooled servers and quickly ramping manufacturing capacity for demanding computing tasks. That has made it an ideal partner for global chipmakers like AMD, which supplies Supermicro with high-performance data center CPUs and GPUs to help it produce dedicated AI servers and capture market share.

Supermicro's H14 family is powered by 5th Gen AMD EPYC processors, which enable up to 192 cores per CPU with up to 500W TDP.