Vital AI Consulting Services

Application Development. Artificial Intelligence. Predictive Analytics. Data Science. Data Governance.
Our Team will help you launch Intelligent, Data-Driven Applications.


Vital AI Development Kit (VDK)

Use the Vital AI Development Kit (VDK) to rapidly build Intelligent Applications.
Download the VDK today!


Apache Spark and Hadoop

Let us help you utilize Apache Spark for lightning-fast data processing.

Download the Vital AI Development Kit (VDK) today!
Start building Intelligent Applications.

Download: Vital AI Development Kit

Vital AI Development Kit, Beta Release

The Beta Program includes:

Database Connections: Vital SQL, Vital SPARQL, Vital DynamoDB
Apache Spark: Aspen (Machine Learning on Spark)
Application Infrastructure: Vital Prime
Web Application Development: Vital JavaScript, Vital VertX
Visualization: Cytoscape Plugin
JVM Scripting: Groovy
Plus: Sample Code, Documentation, and Utilities

The free Beta Program is available until April 15, 2016 with ongoing releases.

The Vital AI Technology

Create Intelligent Data-Driven Applications.

Define your Data Model
Define your data model, generate code, and push to your application, across all architectural layers.
Configure Data Analysis
Configure data analysis components to process your data and create your desired output.
Connect a User Interface
View your data in reports, or enable your customers to dynamically interact with your data.

Data Analysis

With multiple data analysis modules, Vital AI enables various types of Artificial Intelligence.

Machine Learning

With machine learning, you can categorize data or make numerical predictions.

Natural Language Processing

With Natural Language Processing, you can categorize text, extract entities (names of people, places, organizations, and things), extract sentiment, and extract relationships.

Graph Analysis

With graph analysis, including social network analysis, you can determine important items and people in networks.
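As a minimal illustration of the idea (not Vital AI's graph-analysis module), degree centrality ranks people in a network by how many direct connections they have:

```python
# Minimal degree-centrality sketch: rank nodes in a network by
# connection count. Illustrative only, not the Vital AI module.
from collections import Counter

def degree_centrality(edges):
    """Count connections per node from an undirected edge list."""
    degree = Counter()
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    return degree

edges = [("alice", "bob"), ("alice", "carol"), ("alice", "dave"), ("bob", "carol")]
ranked = degree_centrality(edges).most_common()
print(ranked[0])  # ('alice', 3) -- alice has the most connections
```

Real social network analysis uses richer measures (betweenness, PageRank), but the shape of the computation is the same: score nodes, then rank.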

Logical Inference

Using logical inference, you can apply rules to infer new insights from your data.


Applications of Data Analytics and Data Science are numerous. Here are some example applications of Vital AI's technology.

Content and Product Recommendations

Combine Machine Learning and Natural Language Processing into a Recommendation System for your Content, Products, or other Data Objects.

Data Modeling

Model your data with Vital's tools to integrate multiple sources of data seamlessly.

Analytics-Driven Decisions, Optimization of Business Processes, Data Discovery, Hypothesis Generation

Use machine learning to predict outcomes and make better decisions. Applications include Life Science Research, Logistics, and Green Energy.

Conversational Interactions

Use our conversational natural language interface to interact one-on-one with your users.

Vital AI Development Kit (VDK)

The VDK provides tools for data modeling, data object code generation, access to NoSQL data repositories, and data analysis modules for machine learning, natural language processing, and others.

With the VDK, your team can build, test, and deploy Intelligent Applications with processes to iterate and evolve your applications.

GUI and command-line tools are available to assist with data modeling and code deployment.

When it comes time to scale your application across additional server resources, push your code to the Vital AI Application Platform modules for deployment.

VitalSigns Code Generation

Use the vitalsigns command-line tool to generate your data-object code directly from your data model, keeping your code in sync with your data. Put an end to data mismatches and integration problems.
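The core idea behind model-driven code generation can be sketched in a few lines of Python. The model format and class API below are hypothetical, not VitalSigns output; the point is that the data-object class is derived from the declarative model, so code and model cannot drift apart:

```python
# Sketch of model-driven code generation: derive a data-object
# class from a declarative model. Model format and API are
# hypothetical, not the actual VitalSigns tool.
def generate_class(name, fields):
    """Build a class whose constructor validates fields against the model."""
    def __init__(self, **kwargs):
        for field, ftype in fields.items():
            value = kwargs.get(field)
            if value is not None and not isinstance(value, ftype):
                raise TypeError(f"{name}.{field} expects {ftype.__name__}")
            setattr(self, field, value)
    return type(name, (object,), {"__init__": __init__, "MODEL": fields})

# Hypothetical data model for a "User" object.
User = generate_class("User", {"name": str, "age": int})
u = User(name="Ada", age=36)
print(u.name)  # Ada
```

Because every component regenerates its objects from the same model, a field added to the model shows up everywhere at once.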

NoSQL Data Repositories

Connect your application to scalable NoSQL data repositories such as DynamoDB, MongoDB, HBase, and AllegroGraph.

Data Analysis Modules

Use machine learning prediction modules generated via Spark and Hadoop, along with natural language text-analysis, in your applications.

Data Visualization

Data organized by Vital AI may easily be visualized by a variety of visualization tools, allowing for intuitive understanding of data relationships.

Vital AI Application Platform

The Vital Client is added to your application, providing an API to access the Vital Platform. The Vital Client includes a cache of data objects to improve performance, the REST and Queue interfaces, and the VitalSigns data mapping component, which aligns your application data model with the Vital AI Platform. VitalSigns greatly improves development efficiency, eases integration, and eliminates additional data mapping and other data maintenance.

The Vital AI Platform includes Machine Learning, Natural Language Processing, Logical Inference, Graph Analytics, Data Analysis via Spark and Hadoop, Web Crawling, integration with Twitter and Facebook, and large-scale semantic data management, all available via the Vital Client.

The Vital AI Platform is implemented using the Vital AI Service, which consists of three primary layers: Vital Prime, Vital Flow, and Spark+Hadoop. Vital Prime provides in-memory data storage and analysis. Vital Flow provides data processing via multi-step data flows, which may include steps for text processing, graph analysis, machine learning, logical inference, integration with social networks, and integration with other external APIs. Hadoop provides long-term storage of data via HDFS and HBase, and analysis via Map/Reduce and Spark jobs.
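The caching behavior described for the Vital Client can be sketched in miniature. The class and method names below are hypothetical stand-ins, not the actual client API; the sketch only shows the pattern of serving repeat reads from memory and falling back to the repository on a miss:

```python
# Miniature sketch of a client-side data-object cache, in the
# spirit of the Vital Client's cache. Names are hypothetical.
class CachingClient:
    def __init__(self, fetch_fn):
        self._fetch = fetch_fn      # call to the backing repository
        self._cache = {}            # object URI -> data object
        self.misses = 0

    def get(self, uri):
        if uri not in self._cache:
            self.misses += 1
            self._cache[uri] = self._fetch(uri)
        return self._cache[uri]

store = {"user:1": {"name": "Ada"}}   # stand-in for a data repository
client = CachingClient(store.__getitem__)
client.get("user:1")
client.get("user:1")                  # second read served from cache
print(client.misses)  # 1
```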

Vital Prime

Flexible server infrastructure for accessing data repositories, running distributed data processing workflows, accessing cached data, and running real-time analytical scripts.

Vital Flows

Distributed data processing workflows including predictions, natural language processing, graph analytics, and access to external data resources.

Spark+Hadoop Analytics

Large-scale, distributed, parallel, data analytics.


Vital AI provides a low-level Core data model which gives different data processing components a common data framework. The Core data model is extended with objects for common use cases, such as "User", "Document", and "Event"; this is the Vital Domain Model. The Vital Domain Model is in turn extended with the objects a particular application requires, forming the Application Domain Model. For example, an application that recommends movies to users might extend the domain model with objects for "Film", "Actor", and "Genre".

VitalSigns provides development tools to define a data model across the entire application and generate objects, which are then used in various components, including the application's user interface, Vital Flows, and Spark/Hadoop. The definition of the "User" object is therefore identical in the user interface, in the Vital Flows that compute recommendations, and in the Spark/Hadoop machine learning jobs. This speeds development and eliminates many data-incompatibility problems, with VitalSigns handling data mapping across components, programming languages, and data repositories.
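The three model layers can be sketched with plain class inheritance. The class names follow the text ("User", "Film"); the code itself is illustrative Python, not VitalSigns-generated output:

```python
# Sketch of the three model layers: Core data model, Vital Domain
# Model, and Application Domain Model, using plain Python classes.
class CoreObject:                 # low-level Core data model
    def __init__(self, uri):
        self.uri = uri

class User(CoreObject):           # Vital Domain Model (common use cases)
    def __init__(self, uri, name):
        super().__init__(uri)
        self.name = name

class Film(CoreObject):           # Application Domain Model (movie app)
    def __init__(self, uri, title, genre):
        super().__init__(uri)
        self.title = title
        self.genre = genre

# The same User definition is shared by the UI, Vital Flows, and
# Spark/Hadoop jobs: one source of truth for the model.
u = User("user:1", "Ada")
f = Film("film:42", "Metropolis", "science fiction")
print(isinstance(u, CoreObject), isinstance(f, CoreObject))  # True True
```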

Core Data Model

All application models agree on a low-level data model.

Code Generation

All application models use the same data objects defined in the same model.

Descriptive Data Model

The data model captures the meaning of the data, allowing automatic interpretation of the data.

Vital AI Application Platform:
Vital Prime

The Vital Platform uses Vital Prime servers in its implementation. Vital Prime provides in-memory data storage and analysis, interfaces to data repositories, and access to data processing using Vital Flows or Spark/Hadoop. Additionally, Vital Prime implements authentication and authorization for application users.

Vital Prime is accessed via a REST interface. Event objects, such as user "clicks", can be sent to Vital Prime via the Queue interface. Event data sent via the queue typically flows into the Data Matrix, which counts events by type and group. Vital Prime processes queries using a connected data repository such as a text index, a triplestore (NoSQL) database, or HBase.

Vital Prime also provides a scripting interface; such scripts are called "DataScripts". DataScripts implement application data processing functionality, including defining data processing workflows ("Vital Flows"), processing real-time data analytics in the Data Matrix, and accessing the cache and connected data repositories. DataScripts can be run periodically using the JobEngine, or called from the application via the callFunction REST API call. Application functionality is typically implemented as a DataScript, including common features such as returning "top content", "recommended content", or "trending content".
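The Data Matrix idea, counting incoming events by type and group, can be sketched as follows. This is an illustrative toy, not the Vital Prime implementation:

```python
# Sketch of the Data Matrix: count event objects by type and group
# as they arrive from the queue. Illustrative only.
from collections import defaultdict

class DataMatrix:
    def __init__(self):
        self.counts = defaultdict(int)   # (event_type, group) -> count

    def record(self, event_type, group):
        self.counts[(event_type, group)] += 1

    def count(self, event_type, group):
        return self.counts[(event_type, group)]

matrix = DataMatrix()
for event in [("click", "homepage"), ("click", "homepage"), ("view", "article")]:
    matrix.record(*event)
print(matrix.count("click", "homepage"))  # 2
```

A "trending content" DataScript would then just read the highest counters per group over a time window.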

Vital AI Application Platform:
Vital Flow

Vital Flows process data in workflows by stringing together different data access and processing components. Vital Flows are distributed over servers using a queue. Flows consist of steps and sub-steps, and a Flow may "call" another Flow, providing generalized data workflows. DataScripts define and run Vital Flows.

Several servers provide the processing components:

NLP Flow Server: text processing functions such as entity extraction, sentiment analysis, and topic categorization
Graph Analytics Server: large-scale graph analysis, such as social network analysis
Data Flow Server: interfaces with data repositories for accessing and storing data
Integrator Flow Server: interaction with external APIs, including the Facebook and Twitter APIs
Inference Flow Server: a logical inference and rules engine
Machine Learning Flow Server: applies learned models to categorize data
Spark/Hadoop Flow Server: access to Spark/Hadoop jobs and data
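The step-composition structure of a Flow, including a Flow calling another Flow, can be sketched in a few lines. The step functions here are toy stand-ins for the processing servers described above:

```python
# Sketch of a Vital Flow as a sequence of steps, where a step may
# itself be another flow. Steps are toy stand-ins for the NLP and
# analytics servers described in the text.
def make_flow(*steps):
    """Compose steps into a flow; each step transforms the data."""
    def flow(data):
        for step in steps:
            data = step(data)
        return data
    return flow

# Hypothetical steps: tokenize text, then count tokens.
tokenize = lambda text: text.lower().split()
count_tokens = lambda tokens: len(tokens)

inner_flow = make_flow(tokenize)               # a flow can "call" another flow
outer_flow = make_flow(inner_flow, count_tokens)
print(outer_flow("Vital Flows process data"))  # 4
```

In the real platform the steps run distributed over servers via a queue; the composition logic is the part this sketch shows.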

Vital AI Application Platform:
Spark+Hadoop

The Vital Platform uses Spark+Hadoop for large-scale data processing, including machine learning analysis. Data stored in HBase is available for analysis using Map/Reduce and Spark jobs, as is data from application events (such as log data), typically written to HDFS. A Vital Flow, initiated via DataScript, may trigger a Map/Reduce or Spark job. The output of a machine learning job is a learned model, which is loaded into a Vital Flow step and used for categorization within Vital Flows.
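The train-then-deploy cycle described above can be sketched end to end. The "model" here is a trivial keyword scorer standing in for a real Spark-trained model; the serialization and flow-step names are hypothetical:

```python
# Sketch of the cycle: an offline "job" produces a learned model,
# which is loaded into a flow step for categorization. The model
# is a toy keyword scorer, not a real Spark-trained model.
import json

def train(examples):
    """Offline 'job': collect keywords seen in 'sports' examples."""
    keywords = {word for text, label in examples if label == "sports"
                for word in text.split()}
    return json.dumps(sorted(keywords))   # serialized learned model

def make_categorize_step(serialized_model):
    """Load the model into a step usable inside a flow."""
    keywords = set(json.loads(serialized_model))
    return lambda text: "sports" if set(text.split()) & keywords else "other"

model = train([("goal match score", "sports"), ("stock market", "finance")])
categorize = make_categorize_step(model)
print(categorize("final score update"))  # sports
```

Separating training (batch, offline) from application (a lightweight loaded model in a flow step) is the design point; the heavy Spark/Hadoop work never runs in the request path.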


During application development and maintenance, very significant resources are spent mapping and integrating data, including across different levels of the application architecture. Vital AI's goal with the Vital AI Development Kit (VDK) and the application platform modules is to reduce this resource drain to near zero, freeing up time for deeper data analysis and better applications. By using a common data model and standardized tools, user interfaces, application servers, data repositories, and data analysis tools can draw from the same data model, limiting errors, mismatches, and integration problems.

Vital AI Big Data Application Methodology

(1) Data Modeling

Model your data with description metadata

Version control your data model

Collaborate using your data model across your teams

Generate code from your data model

(2) Data Analysis

Use your data model throughout analysis

Define parameters, features, and algorithms in your data model

(3) Deployment

Deploy with data model as code artifact

Validate incoming/outgoing data with data model

Use data model in ongoing production data analytics jobs
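Step (3)'s validation of incoming data against the data model can be sketched as follows. The model format is hypothetical; the point is that validation reuses the same model that drove code generation and analysis:

```python
# Sketch of validating incoming records against the data model
# before they enter production analytics. Model format is a
# hypothetical illustration.
MODEL = {"User": {"name": str, "age": int}}

def validate(record_type, record):
    """Return a list of validation errors against the data model."""
    errors = []
    schema = MODEL.get(record_type)
    if schema is None:
        return [f"unknown type: {record_type}"]
    for field, ftype in schema.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], ftype):
            errors.append(f"bad type for {field}")
    return errors

print(validate("User", {"name": "Ada", "age": 36}))  # []
print(validate("User", {"name": "Ada"}))             # ['missing field: age']
```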

Vital AI MetaQL

See our presentation from the August 2015 NoSQL Now! conference for information about MetaQL.

Big Data Modeling

See our presentation from the August 2014 Semantic Technology Conference for information about data models with Big Data.

Creating Intelligent Apps

See our presentation from the October 2013 NYC Semantic Technology Conference for information about developing Intelligent Apps.

Products and Services


Development Kit
Contact Us for Monthly Fee
Build Applications using the VDK Toolset.
Generate your data model with VitalSigns.
Use NoSQL database implementations such as DynamoDB, MongoDB, HBase, and AllegroGraph.
Use data analysis modules in Machine Learning and Natural Language Processing.
App Platform
Contact Us for Monthly Fee
Deploy your data model in a scalable application framework.
Deploy data analysis components in parallel.
Easy deployment to cloud infrastructure.
No limitations on number of databases.
App Platform + Source
Contact Us for Monthly Fee
Full access to source code of data analytics components.
Ability to customize platform components.
No limitation on number of databases or applications deployed.

Data Science Services

Vital AI can support your team with expertise in data science and data analytics.

Let us focus our expertise in machine learning, natural language processing, and other forms of data analysis, on your needs.

Software Development Services

Vital AI can provide software development services to implement applications using the Vital AI Development Kit and the Vital AI App Platform.

We can support your team on an hourly or project basis. We have experience in mobile, web, and desktop applications. Let our team help build an Intelligent Application for you.

News and Projects

Vital AI Blog: Articles about using the Vital AI Development Kit and Platform are posted on the blog.
School Choice Design Challenge: Matching students with high school programs in NYC.
Haley, Demo of Conversational Interface: Shopping and daily-deal recommendation demo.


Marc Hadfield
Marc Hadfield loves to create software that understands and reacts to data. He was CTO of several start-ups over the past ten years before founding Vital AI, most recently Inform Technologies, which applied Natural Language Processing to the enterprise publishing space. Prior to Inform, Marc was CTO of Alitora Systems, which used large-scale data analysis to gain insight into disease processes for drug discovery, primarily using Natural Language Processing to understand research articles. In creating many data-driven applications, Marc gained experience building out the necessary infrastructure; this experience led to creating the Vital Platform to do the heavy lifting in building data-driven applications.
Dariusz "Derek" Kobylarz
Derek has worked closely with Marc at both Inform and Alitora, implementing both back-end large scale semantic applications and front-end web and mobile user interfaces. Derek can integrate anything with anything, and probably already has. Derek is an expert at the details of the Java Platform and Semantic Web technologies and standards.
Jia Qi Liew
Jia Qi is a marketing and design intern at Vital AI. She studied design at the National University of Singapore and has a strong portfolio in branding and marketing. Jia Qi is always looking for new ways to apply her skills to the big data industry here at Vital AI.

Join Us!

Interested in joining our team? Please contact us!

Contact Information

Office Address:
Vital AI
61 Broadway Suite 1105
New York, NY 10006

Made in New York