How to configure Data Stream main settings
While creating or editing a stream, there are a few specific main configurations you can define and modify.
To find more detailed information on the product and each configuration, see the reference documentation.
In this section, you’ll modify the main settings of your stream via Azion Console: name, data source, template, domains, and destination.
- Access Azion Console > Data Stream.
- From the list, select the stream you want to edit or click the + Stream button.
- Give your stream a unique and easy-to-remember name.
- On the Source dropdown menu, select the one you want to use: Activity History, Edge Applications, Edge Functions, or WAF Events.
- For Edge Functions and WAF Events, you must be subscribed to the products.
- On the Template dropdown menu, select the one you want to use.
- You can modify the presented JSON in the Data Set code box, adding the variables you want to use in your logs’ analysis. See more on How to create a custom template on Data Stream.
- On Domains > Options, select between All Current and Future Domains or Filter Domains.
- See more about each option on How to associate domains on Data Stream.
- Under Destination, select a Connector on the dropdown menu: Standard HTTP/HTTPS POST, Apache Kafka, Simple Storage Service (S3), Google BigQuery, Elasticsearch, Splunk, AWS Kinesis Data Firehose, Datadog, IBM QRadar, Azure Monitor, or Azure Blob Storage.
- You’ll see different fields depending on the endpoint type you choose. Find more information on each of them on the Setting an endpoint page.
- Click the Save button.
In this section, you’ll modify the main settings of your stream via Real-Time Manager: name, data source, template, domains, and destination.
- Access Real-Time Manager > Data Stream.
- From the list, select the stream you want to edit or click the Add Streaming button.
- Give your stream a unique and easy-to-remember name.
- On the Data Source dropdown menu, select the one you want to use: Activity History, Edge Applications, Edge Functions, or WAF Events.
- For Edge Functions and WAF Events, you must be subscribed to the products.
- On the Template dropdown menu, select the one you want to use: a pre-set template according to your chosen data source or a Custom Template.
- If you choose Custom Template, add the variables you want to use in your logs’ analysis in the Data Set code box. See more on How to create a custom template on Data Stream.
- On Options, select between Filter Domains or All Domains.
- See more about each option on How to associate domains on Data Stream.
- Under Destination, select an Endpoint Type on the dropdown menu: Standard HTTP/HTTPS POST, Apache Kafka, Simple Storage Service (S3), Google BigQuery, Elasticsearch, Splunk, AWS Kinesis Data Firehose, Datadog, IBM QRadar, Azure Monitor, or Azure Blob Storage.
- You’ll see different fields depending on the endpoint type you choose. Find more information on each of them on the Setting an endpoint page.
- Click the Save button.
For this section, you’ll be modifying an existing stream with an Elasticsearch connector via API. To create a new one, use the Data Stream POST method.
Check the Data Stream Properties table to see all available properties and their accepted values.
- Run the following GET request in your terminal, replacing [TOKEN VALUE] with your personal token to retrieve your <data_streaming_id>:
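A minimal sketch of this request, assuming the Data Stream API base URL `https://api.azionapi.net` and the `/data_streaming/streamings` path (confirm both in the API reference before use):

```shell
# List all existing streams in the account.
# Replace [TOKEN VALUE] with your personal token.
curl --location \
  --request GET 'https://api.azionapi.net/data_streaming/streamings' \
  --header 'Accept: application/json; version=3' \
  --header 'Authorization: Token [TOKEN VALUE]'
```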
- You’ll receive a response with all your existing streams. Copy the value of the <id> of the stream you want to configure.
- Run a PATCH request to modify the stream as follows:
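As a sketch, a PATCH request updating a stream with an Elasticsearch endpoint could look like the following. The URL path, header format, and all field values here are illustrative assumptions; check the API reference for the exact property names and accepted values:

```shell
# Update an existing stream, replacing <data_streaming_id> with the id
# copied from the previous response and [TOKEN VALUE] with your token.
# The body values below (name, template_id, domain_ids, endpoint fields)
# are placeholders, not real account data.
curl --location \
  --request PATCH 'https://api.azionapi.net/data_streaming/streamings/<data_streaming_id>' \
  --header 'Accept: application/json; version=3' \
  --header 'Authorization: Token [TOKEN VALUE]' \
  --header 'Content-Type: application/json' \
  --data-raw '{
    "name": "my updated stream",
    "template_id": 2,
    "data_source": "http",
    "domain_ids": [1234567890],
    "endpoint": {
      "endpoint_type": "elasticsearch",
      "url": "https://elasticsearch-domain.com/index",
      "api_key": "XYZ_API_KEY"
    }
  }'
```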
| Key | Description |
|---|---|
| name | Name of the stream |
| template_id | Identifier of the template being used |
| domain_ids | Array of the identifiers (integers) of the domains you want to associate with the stream |
| data_source | Data source from which you want to stream your data |
| endpoint | Endpoint to which you want to stream your data |
| all_domains | Associates all current and future domains in your account with the stream |
- You’ll receive a response similar to this:
Wait a few minutes for the changes to propagate and your stream will be updated.
Watch a video tutorial on how to use Data Stream on Azion’s YouTube channel: