Kx Systems, a provider of high-performance database and timeseries analysis software, is celebrating 15 years in business.
Massive data volumes, low latency and event processing have been recurring themes over the past fifteen years. A simple example of the ongoing challenges the world’s financial institutions face is NYSE volumes: in 1993 the NYSE produced half a million TAQ (trade and quote) records per day; by 2008 that figure had risen a thousand-fold, to a staggering half a billion per day.
Back then multi-core machines did not exist as commodity hardware and would not become mainstream until the new millennium. Standard servers had one or two CPUs per machine; now we are looking at 8-, 16-, 32- and even 64-core machines. This requires software vendors to design their software to take full advantage of the available processing power and to remain optimised for future hardware.
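As a concrete illustration of what exploiting those cores can look like, q (the language of Kx’s kdb+) provides the peach primitive, which applies a function across a list on secondary threads. The following is a minimal sketch, not taken from the article; the workload, chunk size and thread count are purely illustrative:

    / start q with secondary threads, e.g.: q -s 8
    f:{sum x*x}                      / CPU-bound work: sum of squares of one chunk
    chunks:1000000 cut 8000000?1f    / split 8m random floats into 8 chunks of 1m
    sum f peach chunks               / peach evaluates f on the chunks in parallel

On a process with no secondary threads, peach simply behaves as the ordinary each iterator, so the same code runs unchanged on one core or many.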
“From the outset we have designed our products in anticipation of vast increases in data volumes. It has always been our philosophy to make the most efficient use of existing hardware and to build in sufficient redundancy and flexibility going forward. This allows us to make full use of the multi-core systems being brought to market, without having to rewrite our software,” says Arthur Whitney, co-founder, Kx Systems.
Kx has always made the best use of existing and upcoming hardware and technology, working closely with the R&D divisions of companies such as Intel. Kx CTO Niall Dalton’s extensive experience and deep understanding of hardware give Kx the advantage of a swift and effective response as hardware vendors bring new technologies to market. Constantly changing technology can place additional (and unwelcome) pressure on clients; Kx therefore tries to accommodate new requirements within its software, protecting and insulating clients from the myriad of technical changes.
The market has moved a very long way in the 15 years since Kx was set up. The complexity and volume of the information that organisations have to process and store have increased dramatically. How financial institutions use that data has also changed beyond recognition. Algorithms are becoming ever more complex, and dark pools of liquidity are adding to the challenges. The available speed and power of equipment is increasing, but so are the demands placed on both software and hardware. Market demands continue to place great pressure on technology; over the years Kx has consistently delivered order-of-magnitude improvements in performance.
“We have seen a lot of changes in the market, including consolidation and mergers of institutions. Our goal has always been to provide our clients with the most efficient, fast and flexible tools for processing real-time and stored data. Our client list speaks for itself and I am very proud that our team has built long-standing relationships with many top institutions around the world. We aim to continue to provide powerful tools for companies to tackle the most complex and data-intensive applications,” says Janet Lustgarten, CEO and co-founder, Kx Systems.
Pavel Kocura, computer science lecturer at Loughborough University, has been teaching Kx’s programming language to his students since the late ’90s.
“I spent a number of years looking for a language that would provide a high-level ‘thinking tool’ for complex modelling and the highest possible performance for operations on extremely large structures. When Kx made their language available to the public in ’98 it was the obvious choice. When I introduced it to my students they loved it, even those who found the more traditional languages hard to learn. The language is easy to use and to teach; it is succinct and efficient and can be used in a wide range of disciplines, from financial markets to engineering and science. A number of my former students are now working for major financial institutions, and still using Kx,” Kocura says.
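To give a flavour of the succinctness Kocura describes, here is a short, hypothetical example in q, the query language Kx later layered on k; the trade table and its columns are invented for illustration:

    / a small illustrative in-memory trade table
    trade:([] sym:`AAPL`AAPL`IBM`IBM; price:100.1 100.3 75.2 75.0; size:200 300 150 250)
    / volume-weighted average price per symbol, in a single line
    select vwap:size wavg price by sym from trade

Where a conventional language would typically need explicit loops or grouping code, the grouping, weighting and aggregation here read as one declarative sentence.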