Introduced at Ignite late last year, Azure HorizonDB is a new PostgreSQL service designed for higher performance and ...
Percona is refocusing on fast, structured database services to help enterprises overcome talent shortages, improve ...
The world tried to kill Andy off, but he had to stay alive to talk about what happened with databases in 2025.
Event-native data platform innovator Kurrent is releasing KurrentDB 26, adding native Kafka Source Connector, Relational Sink, and Custom Indices capabilities that reduce custom code requirements for ...
Bun 1.3 revolutionizes full-stack JavaScript development with unified database APIs and zero-config frontend setup. Experience enhanced performance with built-in Redis support and optimized bundling.
In 2026, contextual memory will no longer be a novel technique; it will become table stakes for many operational agentic AI deployments. At the beginning of the modern generative AI era, purpose-built ...
Real-time security clearances are becoming increasingly common in the manufacturing of advanced-node semiconductors, where data sharing is both essential and a potential security threat. Data security ...
Compensation shapes the culture of every architecture firm, whether you like it or not. It influences who joins your team, who stays, and how people imagine their future in the profession. Many ...
AMD recently held its first financial analyst day, where CEO Lisa Su said the vendor now sees a total addressable AI market that could exceed $1 trillion by 2030, doubling last year's stated target ...
Today all database connectors are bundled into the core package, which leads to: a large core package size (~200MB+), unnecessary dependencies even when only one database is used, and hard-to-maintain ...
During a press Q&A at Confluent's Current 2025 conference this week in New Orleans, I asked CEO Jay Kreps whether it's useful to think about Confluent as establishing a "system of record for context" ...
TL;DR: Intel's new Crescent Island GPU, launching in 2026, features the advanced Xe3P architecture and up to 160GB LPDDR5X memory, optimized for power-efficient AI inference workloads in air-cooled ...