Big Memory Go.
Big Data is a hot topic these days. The industry is moving towards Big Data implementations, and many leading software products have already adopted Big Data architectures.
I would call it the next frontier for innovation, competition, and productivity.
The amount of data handled and processed in our world has been exploding, and the analysis of large data sets will become the key basis of future data handling.
Big Data comes in handy when it is integrated with the Big Memory concept as well. Performance is a key factor for any system, and customers expect high performance with high availability. Usually Java applications run within the configured JVM memory limits. What if a system wants to use the server's memory (RAM) directly? Why leave system memory unused? Can you access system memory to save and retrieve objects efficiently, storing frequently used objects in RAM instead of keeping them in JVM-level data structures?
Yes, you can, with Java's direct memory access features. But why reinvent the wheel? I found a good library called BigMemory Go, maintained by Terracotta.
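To give a feel for what "direct memory access" means in Java, here is a minimal sketch using the standard `ByteBuffer.allocateDirect` API, which reserves native memory outside the JVM heap. This is only an illustration of the underlying mechanism, not the BigMemory Go API itself; the class and method names are my own.

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class DirectMemoryDemo {
    // Writes a value into off-heap (direct) memory and reads it back.
    static String roundTrip(String value) {
        // allocateDirect reserves native memory outside the JVM heap,
        // so the stored bytes are not ordinary heap objects.
        ByteBuffer buffer = ByteBuffer.allocateDirect(1024);
        byte[] payload = value.getBytes(StandardCharsets.UTF_8);
        buffer.put(payload);
        buffer.flip(); // switch the buffer from writing to reading
        byte[] out = new byte[payload.length];
        buffer.get(out);
        return new String(out, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        System.out.println(roundTrip("cached-value")); // prints "cached-value"
    }
}
```

Libraries like BigMemory build serialization, eviction, and search on top of this kind of raw off-heap storage, which is exactly the wheel you would otherwise have to reinvent.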
The best feature I saw in this implementation is its free license version, which lets users grow their BigMemory data set up to 32 GB of RAM, along with direct integration with Ehcache data caching.
This BigMemory implementation allows us to specify the number of elements to keep in Ehcache, with overflowing elements stored in system memory. When I benchmarked this implementation against SQL, it was obvious that systems using this mechanism can reach high performance levels.
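As a sketch of what such a configuration might look like in Ehcache's `ehcache.xml` format: the on-heap entry count is capped and overflow goes to off-heap memory. The cache name and size values below are hypothetical examples; `overflowToOffHeap` and `maxBytesLocalOffHeap` are the Ehcache 2.x attributes used with BigMemory, but check the version you are running.

```xml
<ehcache>
  <!-- Keep up to 10,000 hot entries on the JVM heap; once that limit
       is reached, overflow entries are stored in off-heap (direct) memory. -->
  <cache name="frequentObjects"
         maxEntriesLocalHeap="10000"
         overflowToOffHeap="true"
         maxBytesLocalOffHeap="4g"
         eternal="false"
         timeToLiveSeconds="600"/>
</ehcache>
```

The off-heap tier sits outside the garbage collector's reach, which is why large caches configured this way avoid the long GC pauses a 4 GB on-heap collection would cause.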