This document describes how to use the Ingest Wizard to ingest data directly from the user interface. Note that you can also ingest data through the Interana CLI, which is covered in a separate topic.
Accessing the Ingest Wizard
Navigate to the Ingest Wizard at https://<cluster_location>/?import.
Prerequisite: You must be granted the admin role to configure ingest on your system.
Use case #1: ingest one file
In its simplest flow, the Ingest Wizard allows you to (1) name your event table, (2) choose a sample file, (3) choose your time and actor columns, and (4) ingest that sample file. Read the How To entitled Ingest One JSON File Using the Ingest Wizard for a step-by-step walkthrough.
For a list of supported file types, see the Ingest File Types and Formats Reference.
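To make the flow concrete, here is a minimal sketch (plain Python, not part of Interana) of what a newline-delimited JSON sample file might contain; the field names "timestamp", "user_id", and "action" are hypothetical, chosen only to illustrate the time and actor columns you would pick in step (3).

```python
import json

# Hypothetical newline-delimited JSON events, one event per line.
sample_lines = [
    '{"timestamp": "2017-03-01T12:00:00Z", "user_id": "alice", "action": "login"}',
    '{"timestamp": "2017-03-01T12:00:05Z", "user_id": "bob", "action": "purchase"}',
]

events = [json.loads(line) for line in sample_lines]

# In the wizard you would pick "timestamp" as the time column
# and "user_id" as the actor column for this sample.
time_column, actor_column = "timestamp", "user_id"
```

Every event carries a value for both chosen columns, which is what lets the wizard preview and ingest the file cleanly.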
Use case #2: transform your data
It's pretty common to look at the preview of your data and realize that your data isn't making it into Interana quite the way you hoped. Sometimes that's because your data needs some cleansing, and other times it's because your logging format needs some transformation before it can be read naturally by Interana. In either case, you can use Interana's Transformer Library to do lightweight transformation and cleansing of your data.
For a reference listing of all transformers available in the system, see Transformer Library Steps and Examples, and for a great how-to example, check out Use Transformer Library to Transform Data Before Ingest.
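The kind of lightweight cleansing described above can be pictured with a short sketch. This is plain Python, not the Transformer Library's actual syntax; the field name "user_id" and the cleansing rules are assumptions for illustration.

```python
# Illustrative cleansing: drop events missing a required column and
# normalize a mixed-case actor field. (Not Interana transformer syntax.)
def cleanse(events, required="user_id"):
    cleaned = []
    for event in events:
        if event.get(required) is None:
            continue  # drop rows that cannot be attributed to an actor
        event = dict(event)
        event[required] = str(event[required]).strip().lower()
        cleaned.append(event)
    return cleaned

raw_events = [
    {"user_id": " Alice ", "action": "login"},
    {"action": "orphaned"},  # no actor column: dropped
]
cleaned_events = cleanse(raw_events)
```

The real transformers run the equivalent of these steps on your data before it lands in the event table.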
Use case #3: add a Lookup table
Once you have your event data looking good in Interana, you can ingest one or more lookup tables related to your event table. Lookup tables are not event data; they don't have a timestamp, and instead represent static contextual information that is attached to a particular column of your event data.
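Conceptually, a lookup table behaves like a timestamp-free dictionary keyed on one of your event columns. The sketch below is plain Python for illustration only; the "country" attribute and "user_id" key column are hypothetical.

```python
# Illustrative lookup table: static context keyed on the event
# column "user_id", with no timestamp of its own.
lookup = {
    "alice": {"country": "US"},
    "bob": {"country": "DE"},
}

events = [
    {"timestamp": "2017-03-01T12:00:00Z", "user_id": "alice"},
    {"timestamp": "2017-03-01T12:00:05Z", "user_id": "bob"},
]

# Attaching the lookup enriches each event with the static context.
enriched = [{**e, **lookup.get(e["user_id"], {})} for e in events]
```

Because the lookup rows carry no time column, the same context applies to every event that matches the key, regardless of when it occurred.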
Use case #4: run a batch job
Once you have proven out that your data is structured the way you expect and ingests cleanly into Interana, you'll typically want to set up a batch ingest job. This can be a continuous job, scanning for new data as it is generated, or a one-time job, to backfill existing data.
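The difference between the two batch modes can be sketched in a few lines. This is a conceptual illustration in plain Python, not how Interana implements batch jobs: a one-time job processes a fixed file list once, while each pass of a continuous job picks up only files it has not seen before.

```python
# Conceptual sketch only, not the Interana batch-job implementation.
def one_time_job(files, ingest):
    """Backfill: process a fixed list of existing files once."""
    for f in files:
        ingest(f)

def continuous_scan(list_files, seen, ingest):
    """One pass of a continuous job: ingest only new, unseen files."""
    for f in list_files():
        if f not in seen:
            ingest(f)
            seen.add(f)

ingested = []
seen = set()
continuous_scan(lambda: ["day1.json", "day2.json"], seen, ingested.append)
# Later, new data appears; a second pass picks up only the new file.
continuous_scan(lambda: ["day1.json", "day2.json", "day3.json"], seen, ingested.append)
```

The continuous mode is what lets a job keep scanning for new data as it is generated without re-ingesting what it has already loaded.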
Use case #5: ingest from multiple data sources into one table
Once you have configured your table and set up a batch job to continuously load data into it, you may discover that you have additional data sources that you'd like to load into the same table. To achieve this, you can choose to "add data to an existing table" on the first page of the wizard.
In this scenario, you cannot modify the time and actor columns since they were already configured when you created the table. In fact, you will likely need to apply transformations to your new data source so that its fields (especially time and actor columns) map cleanly into the ones expected by your existing table.
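The field-mapping transformation described above amounts to renaming a new source's columns so they line up with the existing table's schema. The sketch below is plain Python for illustration; the source field names "ts" and "uid" and the mapping itself are assumptions, not Interana syntax.

```python
# Hypothetical mapping from a new source's field names to the
# time and actor columns the existing table already expects.
FIELD_MAP = {"ts": "timestamp", "uid": "user_id"}

def remap(event, field_map=FIELD_MAP):
    """Rename mapped keys; pass unmapped keys through unchanged."""
    return {field_map.get(k, k): v for k, v in event.items()}

new_source_event = {"ts": "2017-03-01T12:00:00Z", "uid": "carol", "page": "/home"}
mapped = remap(new_source_event)
```

After a transformation like this, events from the new source carry the same time and actor column names as the original source, so they land cleanly in the shared table.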