# Quick Start: Ingest MQTT Data into EMQX Tables

This guide walks you through creating a complete MQTT-to-database pipeline using EMQX Broker and EMQX Tables, without relying on external databases. EMQX Tables is a fully managed, native time-series database built into the EMQX Cloud Platform, designed for real-time IoT data processing. 

You will learn how to:

- Create EMQX Broker and EMQX Tables deployments
- Ingest MQTT data into EMQX Tables using the data integration
- Query time-series data with SQL using built-in Data Explorer
- Optionally create custom databases and tables

::: tip Note

The screenshots in this document are provided for reference only.

As EMQX Cloud continues to evolve, the Console UI may be updated or improved over time, and some screenshots may not exactly reflect the latest interface. The overall workflows and functionality, however, remain consistent.

:::

## EMQX Tables Free Trial

EMQX Tables offers a free trial for evaluating native MQTT-to-database ingestion and time-series analytics at no cost.

### Quotas and Duration

The EMQX Tables Free Trial includes:

- 14-day free trial period
- 100 GB outbound traffic
- 100 GB storage capacity

### Expiration

- Trial deployments that have had no active connections for 3 consecutive days will be stopped automatically.
- Trial deployments stop immediately when the trial ends.
- Instances are deleted 3 days after expiration.
- All data will be permanently removed after deletion.

To continue using EMQX Tables after the trial, upgrade the deployment to a paid plan before the trial expires.

## Use Case: Smart Factory Monitoring

Let's look at a concrete example of how you can use EMQX Tables in a real-world scenario. In this example, factory devices regularly report time-series telemetry data, including:

- `machine_id`: Device identifier
- `production_line`: Associated production line
- `temperature`: Temperature reading
- `vibration`: Vibration intensity
- `machine_status`: Operational state (e.g., running, warning, error)
- `ts`: Timestamp of the measurement

You want to ingest this data via MQTT and store it natively in EMQX Tables for monitoring, analytics, or alerting.
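To make the schema concrete, here is a small Python sketch that builds one such telemetry message as a JSON payload. The value ranges and the millisecond epoch format for `ts` are illustrative assumptions, not requirements of the platform:

```python
import json
import random
import time

def make_reading(machine_id, production_line):
    """Build one telemetry message with the fields listed above."""
    return {
        "machine_id": machine_id,
        "production_line": production_line,
        "temperature": round(random.uniform(30.0, 50.0), 1),
        "vibration": round(random.uniform(0.0, 0.2), 2),
        "machine_status": random.choice(["running", "warning", "error"]),
        "ts": int(time.time() * 1000),  # millisecond epoch timestamp
    }

# Serialize one reading as the MQTT message payload.
payload = json.dumps(make_reading("M001", "A"))
```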

## Create EMQX Broker and EMQX Tables Deployments

1. Log in to the EMQX Cloud Console.

2. Create or select an existing [project](../feature/project.md#project-management).

3. Under **EMQX Brokers**, click **+ New Deployment**.

4. Select the **Dedicated Flex** plan, and configure the specifications.

   - Select **Cloud Provider & Region** according to your needs.
   - Keep the default values for the following options for a quick demonstration.
     - Tier
     - Subscribe to Smart Data Hub (Optional)
     - Deployment Name & Project
     - EMQX Version

   Click **Deploy** in the lower right corner. 

   ![create_broker_deployment](./_assets/create_broker_deployment.png)

   See [Create a Dedicated Flex Deployment](../create/dedicated.md) for more detailed instructions.

5. Under **EMQX Tables**, click **+ New Deployment**.

6. Choose the default **Starter** plan, and select the **same cloud provider and region** as your Broker deployment.

7. (Optional) Under **Network Association**, select the existing network from the dropdown. Since both deployments are on the same cloud platform and in the same region, the Broker's network should appear in the list. Selecting it ensures both deployments share the same network and can communicate over a private connection.

8. Choose a Tier.

9. (Optional) Enter a **Deployment Name**.

10. Click **Deploy**. See [Create an EMQX Tables Deployment](./emqx_tables_create_deployment.md) for more details.

   ![new_table_deployment](./_assets/new_table_deployment.png)

Once the deployments are created, you can open either deployment by clicking its card in your project.

Enter your EMQX Tables deployment and click **Data Explorer** in the left menu. You will see that a default database named `public` is available.

## Private Connectivity Between Broker and Tables

When both deployments are in the same cloud platform and region and are associated with the same network, they can communicate over a secure, low-latency private connection. To share a network, select the existing network under **Network Association** when creating the second deployment.

You can verify this from the project's left menu: click **Network Management** to see the shared network and the deployments associated with it.

> Each network can be shared by at most one EMQX Broker and one EMQX Tables deployment at the same time.

If the Broker and Tables deployments are in different regions or networks, private connectivity is not available. In that case, enable the **NAT Gateway** for your Broker deployment under **Network Management** to allow public internet access to EMQX Tables.

## Create a Rule to Ingest Data

We'll use the rule engine to ingest and persist MQTT messages into EMQX Tables.

1. Go to your Dedicated Flex deployment and navigate to **Data Integration**.

2. If this is your first connector, find and select **EMQX Tables** from the connector list. If you have already created connectors, click **+ New Connector** and then select **EMQX Tables**.

   ![create_connector](./_assets/create_connector.png)

3. Select **Quick Setup** (default) and then select the Tables deployment in your project.

   ![connector_quick_setup](./_assets/connector_quick_setup.png)

4. Click **Test** to verify connectivity. A success prompt should be returned.

5. Click **New** and select **New Rule** to proceed to create a rule using this connector.

6. Define the SQL rule in the **SQL Editor**.

   **Rule SQL Example**:

   ```sql
   SELECT
     timestamp as ts,
     payload.machine_id as machine_id,
     payload.production_line as production_line,
     payload.temperature as temperature,
     payload.vibration as vibration,
     payload.machine_status as machine_status
   FROM "factory/+/metrics"
   ```

   > This rule extracts fields from the payload of all MQTT messages that match the topic `factory/+/metrics`. Each extracted value is assigned an alias so it can be referenced in the Write Syntax setting in the rule action. The `timestamp` field is mapped to `ts`, which becomes the time index for storing records in EMQX Tables.

7. Click **Next** to append an action to the rule. Configure the Action settings:

   - **Connector**: Select your EMQX Tables connector.

   - **Time Precision**: `millisecond`

   - **Write Syntax**:

     ```text
     machine_metrics,production_line=${production_line},machine_id=${machine_id} temperature=${temperature},vibration=${vibration},machine_status=${machine_status} ${ts}
     ```
     
     > This syntax automatically creates the `machine_metrics` table (if it does not exist) and uses the InfluxDB Line Protocol format to write data into the table.
     >
     > - **Tags**: `production_line`, `machine_id` (used as dimensions and primary key fields)
     > - **Fields**: `temperature`, `vibration`, `machine_status` (actual metric values)
     > - **Timestamp**: `${ts}` ensures time-series alignment using the extracted `timestamp` from the message

8. Click **Confirm** to save the rule. 

   Back on the **Data Integration** page, you can see the created connector, rule, and action.

   ![create_rule](./_assets/create_rule.png)
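For intuition, the rule and action above can be sketched end to end in Python. `apply_rule` is a rough analogue of the rule SQL (the real extraction runs inside the EMQX rule engine), and `render_row` fills the Write Syntax template to produce one Line Protocol row. Both helpers are illustrative only and are not part of any EMQX API:

```python
import json
from string import Template

# The action's Write Syntax, copied verbatim from the step above.
WRITE_SYNTAX = (
    "machine_metrics,production_line=${production_line},machine_id=${machine_id} "
    "temperature=${temperature},vibration=${vibration},"
    "machine_status=${machine_status} ${ts}"
)

def apply_rule(topic, timestamp_ms, payload_json):
    """Rough analogue of the rule SQL: match the topic filter
    factory/+/metrics, then alias the extracted payload fields."""
    parts = topic.split("/")
    if len(parts) != 3 or parts[0] != "factory" or parts[2] != "metrics":
        return None  # topic does not match the filter; the rule does not fire
    p = json.loads(payload_json)
    return {
        "ts": timestamp_ms,
        "machine_id": p["machine_id"],
        "production_line": p["production_line"],
        "temperature": p["temperature"],
        "vibration": p["vibration"],
        "machine_status": p["machine_status"],
    }

def render_row(row):
    """Substitute one rule-output row into the Line Protocol template."""
    return Template(WRITE_SYNTAX).substitute(row)
```

For example, a message on `factory/A/metrics` produces one `machine_metrics` row with `production_line` and `machine_id` as tags and the remaining values as fields.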

## Publish MQTT Messages

For quick testing, you can use the built-in diagnostic tool inside your Dedicated Flex deployment by clicking **Online Test** in the left menu.

1. In **Online Test**, connect to the deployment using a username and password or auto-generated authentication.

2. In the **Messages** section, publish the following two messages:

   - **Topic**: `factory/A/metrics`

   - **Payload**:

     ```json
     {
       "machine_id": "M001",
       "production_line": "A",
       "temperature": 36.5,
       "vibration": 0.03,
       "machine_status": "running"
     }
     ```

     ```json
     {
       "machine_id": "M002",
       "production_line": "A",
       "temperature": 39.1,
       "vibration": 0.06,
       "machine_status": "warning"
     }
     ```

   ![publish_messages](./_assets/publish_messages.png)

::: tip

If the table is not created in advance, EMQX Cloud automatically creates it based on the data types of the **first successfully written message**. Once a table and its columns are created, all subsequent writes must use the same data types; otherwise, the write will fail.

:::
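Beyond the console's Online Test tool, any MQTT client can publish the same messages. Below is a minimal Python sketch using the third-party paho-mqtt package; `MQTT_HOST`, `MQTT_USER`, and `MQTT_PASS` are placeholder environment variables for your deployment's connection address and credentials, and nothing is sent unless they are set:

```python
import json
import os
import time

TOPIC = "factory/A/metrics"

messages = [
    {"machine_id": "M001", "production_line": "A", "temperature": 36.5,
     "vibration": 0.03, "machine_status": "running"},
    {"machine_id": "M002", "production_line": "A", "temperature": 39.1,
     "vibration": 0.06, "machine_status": "warning"},
]

# Publish only when connection details are provided via environment variables.
if os.environ.get("MQTT_HOST"):
    import paho.mqtt.client as mqtt  # third-party: pip install paho-mqtt

    client = mqtt.Client()
    client.username_pw_set(os.environ["MQTT_USER"], os.environ["MQTT_PASS"])
    client.connect(os.environ["MQTT_HOST"], 1883)
    client.loop_start()
    for m in messages:
        client.publish(TOPIC, json.dumps(m), qos=1)
        time.sleep(0.1)  # give in-flight publishes time to complete
    client.loop_stop()
    client.disconnect()
```

Keep in mind that the first successfully written message fixes the column types of an auto-created table, so keep numeric fields numeric (e.g., `36.5`, not `"36.5"`) across all publishers.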

## Query Data in EMQX Tables

1. Go to your **EMQX Tables** deployment.

2. Click **Data Explorer** in the left menu.

3. Enter the following SQL in the editor and click **Run Query**:

   ```sql
   SELECT * FROM machine_metrics;
   ```

   You will see the ingested messages.

   ![query_test_data](./_assets/query_test_data.png)

## Complete: Native MQTT-to-DB Pipeline

You now have a live pipeline:

**MQTT Clients -> EMQX Broker -> Rule Engine -> EMQX Tables -> SQL Analytics**

The pipeline requires no third-party infrastructure, is fully managed, and is built for time-series IoT workloads.

Next, you can visualize metrics in Grafana or Streamlit. Visit the [Integration Guide](./integration/emqx_tables_integration_guide.md) for more tools and examples. 

## Quick Guide for Using Database Functions

In addition to ingesting data from EMQX Broker into the default `public` database, EMQX Tables also allows you to define custom databases and tables, manually insert data using SQL, and query it using the built-in **Data Explorer** interface. This gives you more flexibility for testing and development.

### Create a Custom Database

You can optionally create a custom database apart from the default `public` database.

1. In your deployment, go to the **Data Explorer** page.

2. Enter the following SQL, and click **Run Query**:

   ```sql
   CREATE DATABASE factory WITH (ttl='7d');
   ```

This creates a new database named `factory` with a data retention time-to-live (TTL) of 7 days.

![create_custom_database](./_assets/create_custom_database.png)

### Create a Table

Inside the new database, you can define a time-series table for storing factory metrics.

In the **Data Explorer**, enter the following SQL in the editor and click **Run Query**:

```sql
CREATE TABLE factory.machine_metrics (
    ts TIMESTAMP NOT NULL,
    production_line STRING,
    machine_id STRING,
    temperature DOUBLE,
    vibration DOUBLE,
    machine_status STRING DEFAULT 'running',
    TIME INDEX (ts),
    PRIMARY KEY (production_line, machine_id)
) WITH (
    ttl='7d'
);
```

This table uses `ts` as the time index and a composite primary key of `production_line` and `machine_id`.

### Insert Data Using SQL

EMQX Tables supports both SQL-based and Line Protocol-based data ingestion. You can also write data by uploading `.txt` and `.lp` files.

In the **Data Explorer**, run the following command to insert sample data:

```sql
INSERT INTO factory.machine_metrics (ts, production_line, machine_id, temperature, vibration, machine_status)
VALUES
    (now(), 'A', 'M001', 36.5, 0.03, 'running'),
    (now(), 'A', 'M002', 39.1, 0.06, 'warning'),
    (now(), 'B', 'M010', 37.2, 0.02, 'running'),
    (now(), 'B', 'M011', 45.6, 0.12, 'error');
```

> `now()` inserts the current timestamp.


<!--

#### Option 2: Insert Data Using InfluxDB Line Protocol

1. Navigate to **Import Data**.

2. Paste the following line protocol into the editor:

   machine_metrics,production_line=A,machine_id=M001 temperature=36.5,vibration=0.03,status="running"
   machine_metrics,production_line=A,machine_id=M002 temperature=39.1,vibration=0.06,status="warning"
   machine_metrics,production_line=B,machine_id=M010 temperature=37.2,vibration=0.02,status="running"
   machine_metrics,production_line=B,machine_id=M011 temperature=45.6,vibration=0.12,status="error"

3. Choose a precision for timestamp (e.g. `Nanoseconds`).

4. Click **Write** to import the data.

This method automatically maps tags to primary keys and fields to metric values.

Line Protocol mappings:

- Measurement → table name  
- Tags -> dimensions (e.g., `production_line`, `machine_id`)  
- Fields -> metrics (e.g., `temperature`, `vibration`, `status`)  
- Timestamp → optional (defaults to current time)
-->

<!--

#### Option 3: Upload `.txt` or `.lp` Files

1. Click the upload icon above the editor in the **Import Data** page.

2. Download the `.lp` template.

3. Edit the file with your Line Protocol entries.

4. Upload the file (`.txt` or `.lp`, up to 10 MB).

5. Click **Import**.

   -->

### Query Your Data

To verify your data, use the **Data Explorer**.

#### Example Query

View all records:

```sql
SELECT * FROM factory.machine_metrics;
```

![query_all_records](./_assets/query_all_records.png)

View the average temperature over the last 60 minutes, grouped by production line and machine status:

```sql
SELECT production_line, machine_status, AVG(temperature) AS avg_temp
FROM factory.machine_metrics
WHERE ts > now() - INTERVAL '60 minute'
GROUP BY production_line, machine_status;
```

![query_by_time_limits](./_assets/query_by_time_limits.png)

You can also filter by device:

```sql
SELECT ts, temperature
FROM machine_metrics
WHERE machine_id = 'M001'
ORDER BY ts DESC
LIMIT 10;
```

![query_filter_by_device](./_assets/query_filter_by_device.png)

### Use Quick Query for Fast Access

You can generate SQL snippets quickly from the table schema:

1. Hover over a column in the left-side schema panel under **Data Explorer**.
2. Click the **vertical ellipsis (︙)** icon beside a column.
3. Select **Quick Query**:
   - Use options like **Query column**, **Query max**, or **Query min**.
4. The generated SQL will appear in the editor automatically.

![quick_query](./_assets/quick_query.png)

## Resources

Visit the GreptimeDB documentation for supported [SQL statements and clauses](https://docs.greptime.com/reference/sql/overview/).

For more detailed instructions on how to query data, refer to [Query Data in EMQX Tables](./emqx_tables_query_guide.md).

