Tuesday, February 26, 2008

Lab49 - have you looked at Esper?

The Lab49 folks have been quite vocal in the Complex Event Processing (CEP) arena for financial services. They have basically ended up partnering with pretty much everyone, skipping to another CEP vendor to partner with almost every single month.
Nice, but certainly tough to build up and scale product expertise, especially given there is not much of a standard in the CEP arena so far - though it definitely gives them a nice view of each product.

To complete their 360° view they should definitely look at Esper and the EsperTech offering, including NEsper as well, since they seem to be quite involved in .NET too. They probably know where to reach out for that, and will certainly hear about it anyway, as the number of deployments of (N)Esper in financial services is growing nicely.

A bit of history on Lab49's somewhat fuzzy logic about partners:
- February 2008 - Aleri and Lab49 Partner on Tool to Visualize Market Liquidity
- January 2008 - BEA-Intel-Lab49 Whitepaper on CEP in Capital Markets (you can also find a more recent, much more vendor-influenced podcast)
- November 2007 - Real-Time Trading and Industry Networking Event with Lab49 at StreamBase
- June 2007 - Coral8 and Lab49 Partner to Provide Algorithmic Trading Framework on the Microsoft Software Platform

Now you know what partner means. We are all partners in making the point that CEP brings business value, and Lab49 does it well in financial services, I think.


Anonymous said...

Hi Alex - yes, we have indeed looked at (Java) Esper and have successfully implemented it on client projects. We have found that Esper works well in a number of use cases, especially when combined with the Spring Framework (which we are also big fans of and use a LOT).
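For readers who have not seen Esper, the heart of it is registering EPL statements against live event streams. The fragment below is a purely illustrative sketch (the `StockTick` event type and its `symbol`/`price` fields are hypothetical, not from any client project):

```sql
// Hypothetical EPL: a 30-second sliding average price per symbol
// over a StockTick event stream (event type and fields are illustrative)
select symbol, avg(price) as avgPrice
from StockTick.win:time(30 sec)
group by symbol
```

In a Spring setup, the Esper engine and the listeners attached to statements like this are typically just wired up as beans alongside the rest of the application.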

To your other point: as most of our partners will attest (including the non-CEP ones), many of our partnerships have yielded real, paying client projects. We have opportunities to partner with many different kinds of companies, none of which we will move forward with unless we feel there is a compelling technology that is applicable to our clients. Nothing goes unvetted.

Alex said...

Excellent, Ross. I think that is indeed the best approach for everyone: no default answer or one-size-fits-all solution, and obviously successful projects require a good team and alignment all along the stack.

I am glad to have you be so vocal here, as it provides great feedback on why and how CEP brings business value. Your multi-partner approach clearly demonstrates vendor neutrality, which is likely a value you need to preserve in your services engagement business - while still having first-hand knowledge from technology providers when needed.

Anonymous said...

Given the range of functionality and the different approaches of the CEP vendors, there really is no one size fits all - which is also why we are careful to retain the neutrality our clients ask us for. That said, some of the things we do find very interesting and useful are the "vertical frameworks" that come out of the box: in capital markets, things like market data adapters, support for pre-trade analytics, and integration with common messaging and caching products.

I would be interested to hear your views on convergence in this space with respect to standards and integration (with grid, datacaches, etc).
I generally agree with your sentiments around benchmarking: most of it is meaningless, and the "millions per sec" figure is not only misleading, it is irrelevant unless you understand the specific use case. High throughput and low latency are the minimum qualification to play in this field, so it's not that interesting to me.

Ultimately, who do YOU think will win in the long run ?

Alex said...

Integration with existing assets and technology is indeed key (time, knowledge, reuse, etc.). I think CEP and distributed caches are particularly interesting together (kool-aid caveats aside) - yet not all use cases require both, and latency vs. capacity vs. tiered processing is likely to require tradeoffs. I expect joint solutions to appear quite soon, and I think CEP is going to be pretty much "the" processing model of XTP platforms, with the distributed cache being the storage cloud - much as the application server and database are in the classical world, but in a far more tightly integrated fashion and at a higher processing abstraction (essentially, CEP is close to a DSL here).
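To give a flavor of what "CEP close to a DSL over a storage cloud" could look like, here is a hypothetical EPL sketch: a named window plays the cache role for reference data, and a live stream is enriched by joining against it (all event types, fields, and window names below are made up for illustration):

```sql
// Hypothetical: keep reference data in a named window (the cache role)
create window RefDataWindow.win:keepall() as select symbol, sector from ReferenceData

// ...then enrich a live tick stream by joining against that window
select t.symbol, t.price, r.sector
from StockTick.win:length(1) as t, RefDataWindow as r
where t.symbol = r.symbol
```

The point is that the join between "data in motion" and "data at rest" is expressed declaratively in one statement, rather than as hand-written plumbing between an app server and a database.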

There is also quite a lot of innovation happening that one must not miss, especially from an XTP angle: column-based storage, GPU processing, plus some hardware virtualization, especially when used to enable dynamic resource pools and live instance migration.

I am also a big fan of saying that XTP and SOA meet in the middle through CEP (the northbound/southbound boundary), and the relationship of CEP with XTP deserves far more than a blog comment.