As data volumes continue to grow, even the highest-powered servers can no longer keep up with the throughput requirements of some applications. Cloud computing offers a framework for horizontally scalable, distributed computing services that can meet the demands of Big Data applications. NetOwl has been deployed on a variety of public and private clouds to help customers address their data analytics needs.
Public clouds such as Amazon Web Services’ (AWS) Elastic Compute Cloud (EC2) provide an accessible platform for rapid deployment of NetOwl services on any number of nodes to meet individual customers’ throughput requirements. NetOwl is also available as SaaS (Software-as-a-Service) for customers who wish to use a NetOwl-hosted service through its RESTful APIs.
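Interacting with a hosted NetOwl service through its RESTful APIs follows the usual pattern of POSTing text and receiving JSON results. The endpoint URL, payload fields, and authentication scheme below are illustrative assumptions, not the actual NetOwl API; consult the product documentation for the real interface.

```python
import json
import urllib.request

# Hypothetical endpoint -- a placeholder, not the real NetOwl API path.
NETOWL_URL = "https://netowl.example.com/api/v2/process"

def build_request(text, language="english"):
    """Assemble a text-extraction request for a hosted NetOwl service.

    The payload shape ("text", "language") is an assumption made for
    illustration; real field names may differ.
    """
    payload = json.dumps({"text": text, "language": language}).encode("utf-8")
    return urllib.request.Request(
        NETOWL_URL,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Accept": "application/json",
        },
        method="POST",
    )

req = build_request("Acme Corp. opened a new office in Berlin.")
# urllib.request.urlopen(req) would submit the text for entity extraction;
# the response would be a JSON document of extracted entities.
```

Because the service is accessed over HTTP, clients in any language can integrate with it, and additional nodes behind the endpoint scale throughput without client-side changes.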
Additionally, an Amazon Machine Image (AMI) that includes a NetOwl installation is available for immediate deployment of scalable NetOwl services across the full range of EC2 instance types, allowing customers to choose appropriately sized servers for their own applications. Other public clouds, such as those offered by Google, IBM, and Microsoft, can also run NetOwl services to provide advanced text and entity analytics for Big Data.
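Deploying an AMI on a chosen instance type is a standard EC2 operation. As a sketch, the snippet below builds the arguments for boto3's `run_instances` call; the AMI ID and instance types shown are placeholders, since the actual NetOwl AMI ID comes from its marketplace listing.

```python
# Sketch of launching a NetOwl AMI on a chosen EC2 instance type.
# The AMI ID is a placeholder; the parameter names are the standard
# EC2 run_instances arguments.

def launch_params(ami_id, instance_type="m5.xlarge", count=1):
    """Build run_instances arguments for an AMI-based deployment.

    A larger instance type sizes each server to the application;
    a higher count adds nodes for horizontal scaling.
    """
    return {
        "ImageId": ami_id,
        "InstanceType": instance_type,
        "MinCount": count,
        "MaxCount": count,
    }

params = launch_params("ami-0123456789abcdef0", instance_type="c5.2xlarge")
# With boto3 installed and AWS credentials configured, the launch would be:
#   import boto3
#   boto3.client("ec2").run_instances(**params)
```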
For customers who require private clouds, NetOwl provides the same text and entity analytics services in such closed environments. Whether on the commodity hardware and infrastructure used by public cloud computing frameworks or on special-purpose hardware such as the LexisNexis High Performance Computing Cluster (HPCC), where NetOwl is currently deployed, NetOwl software is easily deployable across distributed computing environments.