In this guide you'll learn how to visualize data from Apache Pinot using Plotly's web framework. Dash is the most downloaded, trusted Python framework for building ML & data science web apps.
We're going to use Dash to build a real-time dashboard to visualize the changes being made to Wikimedia properties.
Real-Time Dashboard Architecture
We're going to use the following Docker compose file, which spins up instances of Zookeeper, Kafka, along with a Pinot controller, broker, and server:
docker-compose.yml
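The original file isn't reproduced here, so below is a minimal sketch of what it might contain; the image tags, ports, and container names are assumptions based on the standard Apache Pinot and Kafka images:

```yaml
version: "3.7"
services:
  zookeeper:
    image: zookeeper:3.5.6
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka:latest
    ports:
      - "9092:9092"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181/kafka
      KAFKA_BROKER_ID: 0
      KAFKA_ADVERTISED_HOST_NAME: kafka
    depends_on:
      - zookeeper
  pinot-controller:
    image: apachepinot/pinot:0.10.0
    command: "StartController -zkAddress zookeeper:2181"
    ports:
      - "9000:9000"
    depends_on:
      - zookeeper
  pinot-broker:
    image: apachepinot/pinot:0.10.0
    command: "StartBroker -zkAddress zookeeper:2181"
    ports:
      - "8099:8099"
    depends_on:
      - pinot-controller
  pinot-server:
    image: apachepinot/pinot:0.10.0
    command: "StartServer -zkAddress zookeeper:2181"
    depends_on:
      - pinot-broker
```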
Run the following command to launch all the components:
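Assuming the file above is saved as docker-compose.yml in the current directory:

```bash
docker-compose up
```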
Wikimedia provides a continuous stream of structured event data describing changes made to various Wikimedia properties. The events are published over HTTP using the Server-Sent Events (SSE) protocol. You can find the endpoint at https://stream.wikimedia.org/v2/stream/recentchange.
We'll need to install the SSE client library to consume this data:
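The exact package isn't named here; the sketches below assume the sseclient-py library used together with requests:

```bash
pip install sseclient-py requests
```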
Next, create a file called wiki.py that contains the following:
wiki.py
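A minimal sketch of such a script, assuming the sseclient-py and requests libraries and Wikimedia's public recentchange endpoint:

```python
import json
import pprint

import requests
import sseclient

URL = "https://stream.wikimedia.org/v2/stream/recentchange"

# Connect to the recent changes feed using the SSE client library
response = requests.get(URL, stream=True, headers={"Accept": "text/event-stream"})
client = sseclient.SSEClient(response)

# Print each change event as it arrives
for event in client.events():
    payload = json.loads(event.data)
    pprint.pprint(payload)
```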
The SSEClient call is how we connect to the recent changes feed using the SSE client library.
Let's run this script as shown below:
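```bash
python wiki.py
```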
We'll see the following (truncated) output:
Output
Now we're going to import each of the events into Apache Kafka. First let's create a Kafka topic called wiki_events with 5 partitions:
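Assuming the Kafka container from the Compose file is named kafka, the command would look something like this:

```bash
docker exec -it kafka kafka-topics.sh \
  --bootstrap-server localhost:9092 \
  --create \
  --topic wiki_events \
  --partitions 5
```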
Create a new file called wiki_to_kafka.py and import the following libraries:
wiki_to_kafka.py
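A sketch of the imports, assuming the confluent-kafka client library for the producer:

```python
import datetime
import json

import requests
import sseclient
from confluent_kafka import Producer
```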
Add these functions:
wiki_to_kafka.py
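The original helpers aren't reproduced here; plausible versions are a delivery callback that logs failed messages and a JSON serializer that handles datetime values:

```python
def acked(err, msg):
    # Delivery report callback: log any message that failed to reach Kafka
    if err is not None:
        print(f"Failed to deliver message: {msg.value()}: {err.str()}")


def json_serializer(obj):
    # Teach json.dumps how to serialize datetime values
    if isinstance(obj, (datetime.datetime, datetime.date)):
        return obj.isoformat()
    raise TypeError(f"Type {type(obj)} not serializable")
```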
And now let's add the code that calls the recent changes API and imports events into the wiki_events topic:
wiki_to_kafka.py
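A sketch of that code; keying messages by the event's meta.id field and flushing every 100 events are assumptions:

```python
producer = Producer({"bootstrap.servers": "localhost:9092"})

URL = "https://stream.wikimedia.org/v2/stream/recentchange"
response = requests.get(URL, stream=True, headers={"Accept": "text/event-stream"})
client = sseclient.SSEClient(response)

events_processed = 0
for event in client.events():
    payload = json.loads(event.data)

    # Ingest the event into the wiki_events topic
    producer.produce(
        topic="wiki_events",
        key=str(payload["meta"]["id"]),
        value=json.dumps(payload, default=json_serializer),
        callback=acked,
    )

    events_processed += 1
    if events_processed % 100 == 0:
        # Flush every 100 events so they get pushed to the brokers
        print(f"{datetime.datetime.now()} Flushed {events_processed} events")
        producer.flush()
```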
The calls to produce and flush are where events are ingested into Kafka and then flushed.
If we run this script:
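```bash
python wiki_to_kafka.py
```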
We'll see a message every time 100 messages are pushed to Kafka, as shown below:
Output
Let's check that the data has made its way into Kafka.
The following command returns the message offset for each partition in the wiki_events topic:
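One way to do that is Kafka's GetOffsetShell tool:

```bash
docker exec -it kafka kafka-run-class.sh kafka.tools.GetOffsetShell \
  --broker-list localhost:9092 \
  --topic wiki_events
```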
Output
Looks good. We can also stream all the messages in this topic by running the following command:
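For example, with the console consumer:

```bash
docker exec -it kafka kafka-console-consumer.sh \
  --bootstrap-server localhost:9092 \
  --topic wiki_events \
  --from-beginning
```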
Output
Now let's configure Pinot to consume the data from Kafka.
We'll have the following schema:
schema.json
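A sketch of the schema; the table name and the exact field list depend on which event properties you keep, so treat the names below as assumptions:

```json
{
  "schemaName": "wikievents",
  "dimensionFieldSpecs": [
    {"name": "id", "dataType": "STRING"},
    {"name": "wiki", "dataType": "STRING"},
    {"name": "user", "dataType": "STRING"},
    {"name": "title", "dataType": "STRING"},
    {"name": "domain", "dataType": "STRING"},
    {"name": "type", "dataType": "STRING"}
  ],
  "dateTimeFieldSpecs": [
    {
      "name": "ts",
      "dataType": "TIMESTAMP",
      "format": "1:MILLISECONDS:EPOCH",
      "granularity": "1:MILLISECONDS"
    }
  ]
}
```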
And the following table config:
table.json
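And a sketch of the real-time table config; the streamConfigs block is the part that connects Pinot to the Kafka topic, using Pinot's standard Kafka consumer factory and JSON message decoder:

```json
{
  "tableName": "wikievents",
  "tableType": "REALTIME",
  "segmentsConfig": {
    "timeColumnName": "ts",
    "schemaName": "wikievents",
    "replication": "1",
    "replicasPerPartition": "1"
  },
  "tenants": {},
  "tableIndexConfig": {
    "loadMode": "MMAP",
    "streamConfigs": {
      "streamType": "kafka",
      "stream.kafka.topic.name": "wiki_events",
      "stream.kafka.broker.list": "kafka:9092",
      "stream.kafka.consumer.type": "lowlevel",
      "stream.kafka.consumer.prop.auto.offset.reset": "smallest",
      "stream.kafka.consumer.factory.class.name": "org.apache.pinot.plugin.stream.kafka20.KafkaConsumerFactory",
      "stream.kafka.decoder.class.name": "org.apache.pinot.plugin.stream.kafka.KafkaJSONMessageDecoder"
    }
  },
  "metadata": {}
}
```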
The streamConfigs section is how we connect Pinot to the Kafka topic that contains the events. Create the schema and table by running the following command:
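A sketch of that command, assuming the config files are mounted into the controller container at /config:

```bash
docker exec -it pinot-controller bin/pinot-admin.sh AddTable \
  -schemaFile /config/schema.json \
  -tableConfigFile /config/table.json \
  -exec
```

Once you've done that, navigate to the Pinot Query Console and run the following query to check that the data has made its way into Pinot:

```sql
SELECT *
FROM wikievents
LIMIT 10
```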
As long as you see some records, everything is working as expected.
Now let's write some more queries against Pinot and display the results in Dash.
First, install the following libraries:
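The code that follows assumes Dash, Plotly, pandas, and the pinotdb driver:

```bash
pip install dash pinotdb plotly pandas
```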
Create a file called app.py, import the libraries we need, and write a header for the page:
app.py
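A sketch of that setup; the page title is an assumption:

```python
import dash
import pandas as pd
import plotly.express as px
import plotly.graph_objects as go
from dash import dcc, html
from pinotdb import connect

# Create the Dash app and give the page a title
app = dash.Dash(__name__, title="Wiki Recent Changes Dashboard")

# A header for the page; later snippets extend this layout
app.layout = html.Div([
    html.H1("Wiki Recent Changes Dashboard"),
])
```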
Connect to Pinot and write a query that returns recent changes, along with the users who made the changes, and domains where they were made:
app.py
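A sketch of that query, assuming the pinotdb driver pointed at the broker on port 8099. Pinot's ago() function and FILTER clauses bucket the counts into the last minute and the minute before; the column aliases are assumptions:

```python
conn = connect(host="localhost", port=8099, path="/query/sql", scheme="http")

summary_query = """
SELECT count(*) FILTER (WHERE ts > ago('PT1M')) AS events1Min,
       count(*) FILTER (WHERE ts <= ago('PT1M') AND ts > ago('PT2M')) AS events1Min2Min,
       distinctcount(user) FILTER (WHERE ts > ago('PT1M')) AS users1Min,
       distinctcount(user) FILTER (WHERE ts <= ago('PT1M') AND ts > ago('PT2M')) AS users1Min2Min,
       distinctcount(domain) FILTER (WHERE ts > ago('PT1M')) AS domains1Min,
       distinctcount(domain) FILTER (WHERE ts <= ago('PT1M') AND ts > ago('PT2M')) AS domains1Min2Min
FROM wikievents
"""

curs = conn.cursor()
curs.execute(summary_query)
df_summary = pd.DataFrame(curs, columns=[item[0] for item in curs.description])
```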
The FILTER clauses are what count the number of events from the last minute and the minute before that. We do the same thing to count the number of unique users and domains.
Now let's create some metrics based on that data.
First, let's create a couple of helper functions for creating these metrics:
dash_utils.py
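A sketch of two such helpers built on Plotly's Indicator trace: one that shows a value with a delta against a reference value, and one that shows just the value:

```python
import plotly.graph_objects as go


def add_delta_trace(fig, title, value, last_value, row, column):
    # Indicator showing the latest value plus the relative change vs last_value
    fig.add_trace(go.Indicator(
        mode="number+delta",
        title={"text": title},
        value=value,
        delta={"reference": last_value, "relative": True},
        domain={"row": row, "column": column},
    ))


def add_trace(fig, title, value, row, column):
    # Indicator showing just the latest value
    fig.add_trace(go.Indicator(
        mode="number",
        title={"text": title},
        value=value,
        domain={"row": row, "column": column},
    ))
```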
And now let's add the following import to app.py:
app.py
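Assuming the helper names from the sketch above:

```python
from dash_utils import add_delta_trace, add_trace
```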
And the following code at the end of the file:
app.py
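A sketch: if we have data for the previous minute we show deltas, otherwise plain numbers; run_server starts Dash's development server:

```python
fig = go.Figure(layout=go.Layout(grid={"rows": 1, "columns": 3}))

if df_summary["events1Min2Min"][0] > 0:
    # We have a previous minute to compare against, so show deltas
    add_delta_trace(fig, "Changes", df_summary["events1Min"][0], df_summary["events1Min2Min"][0], 0, 0)
    add_delta_trace(fig, "Users", df_summary["users1Min"][0], df_summary["users1Min2Min"][0], 0, 1)
    add_delta_trace(fig, "Domains", df_summary["domains1Min"][0], df_summary["domains1Min2Min"][0], 0, 2)
else:
    add_trace(fig, "Changes", df_summary["events1Min"][0], 0, 0)
    add_trace(fig, "Users", df_summary["users1Min"][0], 0, 1)
    add_trace(fig, "Domains", df_summary["domains1Min"][0], 0, 2)

app.layout = html.Div([
    html.H1("Wiki Recent Changes Dashboard"),
    dcc.Graph(figure=fig),
])

if __name__ == "__main__":
    app.run_server(debug=True)
```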
Go back to the terminal and run the following command:
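```bash
python app.py
```

Navigate to http://localhost:8050 (Dash's default address) to see the app.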
Next, let's add a line chart that shows the number of changes made to Wikimedia per minute. Update app.py as follows:
app.py
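A sketch of the additions: a query that buckets changes by minute (the DATETRUNC/ToDateTime formatting is an assumption) and a Plotly Express line chart appended to the layout:

```python
time_series_query = """
SELECT ToDateTime(DATETRUNC('MINUTE', ts), 'yyyy-MM-dd HH:mm:ss') AS dateMin,
       count(*) AS changes
FROM wikievents
WHERE ts > ago('PT1H')
GROUP BY dateMin
ORDER BY dateMin
"""

curs.execute(time_series_query)
df_ts = pd.DataFrame(curs, columns=[item[0] for item in curs.description])

app.layout = html.Div([
    html.H1("Wiki Recent Changes Dashboard"),
    dcc.Graph(figure=fig),
    dcc.Graph(figure=px.line(df_ts, x="dateMin", y="changes", title="Changes per minute")),
])
```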
Go back to the web browser and you should see something like this:
At the moment we need to refresh our web browser to update the metrics and line chart, but it would be much better if that happened automatically. Let's now add auto refresh functionality.
This will require some restructuring of our application so that each component is rendered from a function annotated with a callback that causes the function to be called on an interval.
The app layout now looks like this:
app.py
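A sketch of that layout:

```python
app.layout = html.Div([
    html.H1("Wiki Recent Changes Dashboard"),
    # Fires the callbacks below every 1,000 milliseconds
    dcc.Interval(id="interval-component", interval=1000, n_intervals=0),
    # Empty containers that the callbacks render into
    html.Div(id="latest-timestamp"),
    html.Div(id="indicators"),
    html.Div(id="time-series"),
])
```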
- `interval-component` is configured to fire a callback every 1,000 milliseconds.
- `latest-timestamp` is a container that holds the latest timestamp.
- `indicators` will contain indicators with the latest counts of users, domains, and changes.
- `time-series` will contain the time series line chart.
The timestamp is refreshed by the following callback function:
app.py
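A sketch of that callback, assuming dash.dependencies for Input and Output:

```python
import datetime

from dash.dependencies import Input, Output


@app.callback(
    Output(component_id="latest-timestamp", component_property="children"),
    Input("interval-component", "n_intervals"),
)
def update_timestamp(n):
    # Re-rendered on every tick of interval-component
    return html.Span(f"Last updated: {datetime.datetime.now()}")
```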
The indicators are refreshed by this function:
app.py
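A sketch, re-running the summary query from earlier on each tick:

```python
@app.callback(
    Output(component_id="indicators", component_property="children"),
    Input("interval-component", "n_intervals"),
)
def indicators(n):
    curs = conn.cursor()
    curs.execute(summary_query)
    df = pd.DataFrame(curs, columns=[item[0] for item in curs.description])

    fig = go.Figure(layout=go.Layout(grid={"rows": 1, "columns": 3}))
    if df["events1Min2Min"][0] > 0:
        add_delta_trace(fig, "Changes", df["events1Min"][0], df["events1Min2Min"][0], 0, 0)
        add_delta_trace(fig, "Users", df["users1Min"][0], df["users1Min2Min"][0], 0, 1)
        add_delta_trace(fig, "Domains", df["domains1Min"][0], df["domains1Min2Min"][0], 0, 2)
    else:
        add_trace(fig, "Changes", df["events1Min"][0], 0, 0)
        add_trace(fig, "Users", df["users1Min"][0], 0, 1)
        add_trace(fig, "Domains", df["domains1Min"][0], 0, 2)
    return dcc.Graph(figure=fig)
```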
And finally, the following function refreshes the line chart:
app.py
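A sketch, using the per-minute query from the line chart section:

```python
@app.callback(
    Output(component_id="time-series", component_property="children"),
    Input("interval-component", "n_intervals"),
)
def time_series(n):
    curs = conn.cursor()
    curs.execute(time_series_query)
    df = pd.DataFrame(curs, columns=[item[0] for item in curs.description])
    return dcc.Graph(figure=px.line(df, x="dateMin", y="changes", title="Changes per minute"))
```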
If we navigate back to our web browser, we'll see the following:
The full script used in this example is shown below:
app.py
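A consolidated sketch of the whole app, under the same assumptions as the snippets above:

```python
import datetime

import dash
import pandas as pd
import plotly.express as px
import plotly.graph_objects as go
from dash import dcc, html
from dash.dependencies import Input, Output
from pinotdb import connect

from dash_utils import add_delta_trace, add_trace

conn = connect(host="localhost", port=8099, path="/query/sql", scheme="http")

summary_query = """
SELECT count(*) FILTER (WHERE ts > ago('PT1M')) AS events1Min,
       count(*) FILTER (WHERE ts <= ago('PT1M') AND ts > ago('PT2M')) AS events1Min2Min,
       distinctcount(user) FILTER (WHERE ts > ago('PT1M')) AS users1Min,
       distinctcount(user) FILTER (WHERE ts <= ago('PT1M') AND ts > ago('PT2M')) AS users1Min2Min,
       distinctcount(domain) FILTER (WHERE ts > ago('PT1M')) AS domains1Min,
       distinctcount(domain) FILTER (WHERE ts <= ago('PT1M') AND ts > ago('PT2M')) AS domains1Min2Min
FROM wikievents
"""

time_series_query = """
SELECT ToDateTime(DATETRUNC('MINUTE', ts), 'yyyy-MM-dd HH:mm:ss') AS dateMin,
       count(*) AS changes
FROM wikievents
WHERE ts > ago('PT1H')
GROUP BY dateMin
ORDER BY dateMin
"""

app = dash.Dash(__name__, title="Wiki Recent Changes Dashboard")
app.layout = html.Div([
    html.H1("Wiki Recent Changes Dashboard"),
    dcc.Interval(id="interval-component", interval=1000, n_intervals=0),
    html.Div(id="latest-timestamp"),
    html.Div(id="indicators"),
    html.Div(id="time-series"),
])


def run_query(query):
    # Run a query against Pinot and wrap the result in a DataFrame
    curs = conn.cursor()
    curs.execute(query)
    return pd.DataFrame(curs, columns=[item[0] for item in curs.description])


@app.callback(Output("latest-timestamp", "children"),
              Input("interval-component", "n_intervals"))
def update_timestamp(n):
    return html.Span(f"Last updated: {datetime.datetime.now()}")


@app.callback(Output("indicators", "children"),
              Input("interval-component", "n_intervals"))
def indicators(n):
    df = run_query(summary_query)
    fig = go.Figure(layout=go.Layout(grid={"rows": 1, "columns": 3}))
    if df["events1Min2Min"][0] > 0:
        add_delta_trace(fig, "Changes", df["events1Min"][0], df["events1Min2Min"][0], 0, 0)
        add_delta_trace(fig, "Users", df["users1Min"][0], df["users1Min2Min"][0], 0, 1)
        add_delta_trace(fig, "Domains", df["domains1Min"][0], df["domains1Min2Min"][0], 0, 2)
    else:
        add_trace(fig, "Changes", df["events1Min"][0], 0, 0)
        add_trace(fig, "Users", df["users1Min"][0], 0, 1)
        add_trace(fig, "Domains", df["domains1Min"][0], 0, 2)
    return dcc.Graph(figure=fig)


@app.callback(Output("time-series", "children"),
              Input("interval-component", "n_intervals"))
def time_series(n):
    df = run_query(time_series_query)
    return dcc.Graph(figure=px.line(df, x="dateMin", y="changes", title="Changes per minute"))


if __name__ == "__main__":
    app.run_server(debug=True)
```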
In this guide we've learnt how to publish data into Kafka from Wikimedia's event stream, ingest it from there into Pinot, and finally make sense of the data using SQL queries run from Dash.