This post has been republished via RSS; it originally appeared at: Microsoft Tech Community - Latest Blogs.
Real-time data analytics is the process of collecting, analyzing, and using data in real time to make informed decisions. It involves capturing data as it is generated, processing it immediately, and presenting actionable insights to users without any delay. Real-time data analytics can help you improve customer satisfaction, business intelligence, business development, fraud detection, and efficiency.
In this blog, you will learn how to use Azure Event Hubs, Microsoft Fabric, and Power BI to create a scalable and efficient solution for real-time data. You will also see how to use a Python script to simulate a device that sends data to Azure Event Hubs every time you run it.
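The sender script used in this post is linked further below; as a rough sketch of what such a device simulator can look like (assuming the `azure-eventhub` Python package; the connection string and event hub name are placeholders, and the actual `sender.py` in the linked repo may differ):

```python
import json
import random
import time


def make_reading(device_id="sim-001"):
    """One simulated telemetry event, serialized as JSON for the eventstream."""
    return json.dumps({
        "deviceId": device_id,
        "temperature": round(random.uniform(18.0, 30.0), 2),
        "humidity": round(random.uniform(30.0, 70.0), 2),
        "timestamp": time.time(),
    })


def send_readings(connection_str, eventhub_name, count=10):
    """Send `count` simulated readings to Event Hubs in a single batch.
    Requires the azure-eventhub package: pip install azure-eventhub."""
    from azure.eventhub import EventData, EventHubProducerClient

    producer = EventHubProducerClient.from_connection_string(
        connection_str, eventhub_name=eventhub_name)
    with producer:
        batch = producer.create_batch()
        for _ in range(count):
            batch.add(EventData(make_reading()))
        producer.send_batch(batch)


# Usage (substitute your own namespace connection string and event hub name):
# send_readings(
#     "Endpoint=sb://<namespace>.servicebus.windows.net/;"
#     "SharedAccessKeyName=<policy>;SharedAccessKey=<key>",
#     "learnfabric",
# )
```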
Prerequisites:
- Azure account
- Microsoft Fabric license
- Visual Studio Code
- Python
1. Create a resource group
Create a resource group: A resource group is a logical collection of Azure resources.
- Sign in to the Azure portal.
- In the left navigation, select Resource groups, and then select Create.
- For Subscription, select the name of the Azure subscription in which you want to create the resource group.
- Type a unique name for the resource group. The system immediately checks to see if the name is available in the currently selected Azure subscription.
- Select a region for the resource group.
- Select Review + Create. On the Review + Create page, select Create.
2. Create an Event Hubs namespace
An Event Hubs namespace provides a unique scoping container, in which you create one or more event hubs.
In the Azure portal:
- In Marketplace, search for Event Hubs.
- Select Create on the toolbar.
- On the Create namespace page, take the following steps:
- Select the subscription in which you want to create the namespace.
- Select the resource group you created in the previous step.
- Enter a name for the namespace. The system immediately checks to see if the name is available.
- Select a location for the namespace.
- Choose Basic for the pricing tier.
- Leave the throughput units (for the Standard tier) or processing units (for the Premium tier) setting as is.
Select Review + Create at the bottom of the page. On the Review + Create page, review the settings, and select Create.
3. Create an event hub in the namespace
To create an event hub within the namespace, do the following actions:
On the Overview page, select + Event hub on the command bar.
- Type a name for your event hub, then select Review + create.
- After the event hub is created, you can check the status of the creation in the portal notifications.
4. Get data from Azure Event Hubs
4.1. Set a shared access policy on your event hub
Before you can create a connection to your Event Hubs data, you need to set a shared access signature (SAS) policy on the event hub and collect some information to be used later when setting up the connection.
In the Azure portal, browse to the event hubs instance you want to connect.
- Under Settings, select Shared access policies.
- Select +Add to add a new SAS policy, or select an existing policy with Manage permissions.
- Enter a Policy name.
- Select Manage, and then Create.
4.2. Gather information for the cloud connection
Within the SAS policy pane, take note of the following four fields. You might want to copy these values and paste them somewhere, like a notepad, for use in a later step.

| Field reference | Field | Description | Example |
|---|---|---|---|
| a | Event Hubs instance | The name of the event hub instance. | learnfabric |
| b | SAS policy | The SAS policy name created in the previous step. | learnfabricSAP |
| c | Primary key | The key associated with the SAS policy. | xxxxxxxxxxxxxxxxx |
| d | Event Hubs namespace | The name of the Event Hubs namespace that contains the instance. | eventdatacollect |
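These fields combine into the standard Event Hubs connection string format. A small helper illustrating how they fit together (the values used below are the example values from the table; the key is a placeholder):

```python
def connection_string(namespace: str, policy_name: str, primary_key: str) -> str:
    """Assemble an Azure Event Hubs connection string from the fields above:
    the namespace (field d), SAS policy name (field b), and primary key (field c)."""
    return (
        f"Endpoint=sb://{namespace}.servicebus.windows.net/;"
        f"SharedAccessKeyName={policy_name};"
        f"SharedAccessKey={primary_key}"
    )


# Example with the sample values from the table (placeholder key):
print(connection_string("eventdatacollect", "learnfabricSAP", "xxxxxxxxxxxxxxxxx"))
```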
4.3. Create a consumer group
To create a new consumer group, navigate to the Azure portal, go to your Event Hubs namespace, select your event hub, select Consumer groups, and then select + Consumer group to add a new one.
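A consumer group is an independent view of the event stream; the Fabric eventstream you configure later reads through the group you create here. For illustration, a minimal sketch of reading events through a named consumer group yourself (assuming the `azure-eventhub` package; the connection string and names are placeholders):

```python
def receive_events(connection_str, eventhub_name, consumer_group="Data"):
    """Print events read through the given consumer group ("$Default" is
    built in; "Data" is the custom group created above).
    Requires the azure-eventhub package: pip install azure-eventhub."""
    from azure.eventhub import EventHubConsumerClient

    def on_event(partition_context, event):
        # Each partition delivers events independently.
        print(partition_context.partition_id, event.body_as_str())

    client = EventHubConsumerClient.from_connection_string(
        connection_str, consumer_group=consumer_group, eventhub_name=eventhub_name)
    with client:
        # starting_position="-1" reads from the beginning of the stream.
        client.receive(on_event=on_event, starting_position="-1")
```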
5. Create a workspace
To create a workspace within the Microsoft Fabric portal, do the following actions:
- Sign in to Power BI.
- Select Workspaces > New workspace.
- Fill out the Create a workspace form as follows: Name: enter a name for the workspace, plus some characters for uniqueness.
- Expand the Advanced section.
- Choose Fabric capacity or Trial in the License mode section.
- Choose a premium capacity you have access to.
- Select Apply. The workspace is created and opened.
5.1. KQL Database
Now you need to create a KQL Database, here are the steps:
- Ensure you have a workspace with a Microsoft Fabric-enabled capacity.
- Select New > KQL Database.
- Enter your database name, then select Create.
5.2. Database details
The main page of your KQL database shows an overview of the contents of your database; for now, it's empty.
5.3. Create an eventstream
On the Workspace page, select New and then Eventstream:
- Enter a name for the new eventstream and select Create.
The creation of the new eventstream in your workspace may take a few seconds. Once it’s done, you're directed to the main editor where you can add sources and destinations to your eventstream.
5.3.1. Source
- On the lower ribbon of your KQL database, select Get Data.
- In the Get data window, the Source tab is selected.
- Select the data source from the available list. In this example, you're ingesting data from Azure Event Hubs.
5.3.2. Configure
- Enter a source name of your choice.
- Select Create new connection, or select an existing connection. For this example, you will create a new connection.
- To have data to ingest, run the sender script from your terminal; it sends the simulated device data to Azure Event Hubs:
  - python sender.py
The script is available at PascalBurume/Event-Hub (github.com).
Create new connection
- Fill out the Connection settings according to the following table:

| Setting | Description | Example value |
|---|---|---|
| Event hub namespace | Field d from the table above. | eventdatacollect |
| Event hub | Field a from the table above. The name of the event hub instance. | learnfabric |
| Connection | To use an existing cloud connection between Fabric and Event Hubs, select the name of this connection. Otherwise, select Create new connection. | Create new connection |
| Connection name | The name of your new cloud connection. This name is autogenerated but can be overwritten. Must be unique within the Fabric tenant. | Connection |
| Authentication kind | Auto-populated. Currently only Shared Access Key is supported. | Shared Access Key |
| Shared Access Key Name | Field b from the table above. The name you gave to the shared access policy. | learnfabricSAP |
| Shared Access Key | Field c from the table above. The primary key associated with the SAS policy. | xxxxxxxxxxxxxxxxx |
- Select Save. A new cloud data connection between Fabric and Event Hubs is created.
Connect the cloud connection to your Event Stream
Whether you created a new cloud connection or you're using an existing one, you need to define the consumer group.
- Fill out the following fields according to the table:
| Setting | Description | Example value |
|---|---|---|
| Destination name | A name of your choice for the destination. | Data_collection |
| Consumer group | The relevant consumer group defined in your event hub. After adding a new consumer group, select it from the drop-down. | Data |
| Workspace | The workspace where your database is located. | RealTime_event |
| KQL Database | The name of your database. | Event_Kdb |
| Destination table | The name of the table where your data will be stored. | Data_ |
| Input data format | The format of the incoming data you want to ingest. Data from the eventstream that matches the selected format is ingested into the KQL database. | JSON |
- Select Save.
5.3.3. Database details
The main page of your KQL database shows an overview of the contents in your database. The following table lists the available information.
| Card | Item | Description |
|---|---|---|
| Database details | Created by | User name of the person who created the database. |
| | Region | Shows the region of the data and services. |
| | Created on | Date of database creation. |
| | Last ingestion | Date on which data was last ingested into the database. |
| | Query URI | URI that can be used to run queries or to store management commands. |
| | Ingestion URI | URI that can be used to get data. |
| | OneLake folder | OneLake folder path that can be used for creating shortcuts. You can also activate and deactivate data copy to OneLake. |
| Size | Compressed | Total size of compressed data. |
| | Original size | Total size of uncompressed data. |
| | Compression ratio | Compression ratio of the data. |
| Top tables | Name | Lists the names of tables in your database. Select a table to see more information. |
| | Size | Data size in megabytes. The tables are listed in descending order according to data size. |
| Most active users | Name | User names of the most active users in the database. |
| | Queries run last month | The number of queries run per user in the last month. |
| Recently updated functions | | Lists the function name and the time it was last updated. |
| Recently used Querysets | | Lists the recently used KQL queryset and the time it was last accessed. |
| Recently created Data streams | | Lists the data stream and the time it was created. |
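The Query URI shown in this overview can also be used programmatically. A hedged sketch, assuming the `azure-kusto-data` package and Azure CLI authentication (`az login`); the database and table names are the example values from this post:

```python
def query_event_table(query_uri, database="Event_Kdb", table="Data_"):
    """Run a simple KQL query against the database through its Query URI.
    Requires: pip install azure-kusto-data, and a prior `az login`."""
    from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

    kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(query_uri)
    client = KustoClient(kcsb)
    # Fetch up to 10 ingested events from the destination table.
    response = client.execute(database, f"{table} | take 10")
    return list(response.primary_results[0])
```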
6. Create a report
To create a report from your KQL database:
- Open the Explore your data window from the KQL database.
- On the ribbon, select Build Power BI report.
6.1. Report preview
You can add visualizations in the report's preview. In the Data pane, expand Kusto Query Result to see a summary of your query.
When you're satisfied with the visualizations, select File on the ribbon, and then Save this report to name and save your report in a workspace.
6.2. Report details
- In Name your file in Power BI, give your file a name.
- Select the workspace in which to save this report. The report can be saved in a different workspace than the one you started in.
- Select the sensitivity label to apply to the report.
- Select Continue.
6.3. Build your report
You can build the report with visuals such as line charts and tables to view information about the real-time data.
You can run the script again from the terminal to collect more data from the device:
- python sender.py
After refreshing the report, you can see that the visuals have been updated with the new data.
Congratulations on completing this tutorial blog on how to leverage Azure Event Hub, Microsoft Fabric, and Power BI for real-time data analytics. You have learned how to create a scalable and reliable data pipeline that can ingest, process, and visualize streaming data from devices. You have also gained valuable skills and insights on how to use Microsoft's powerful cloud services and tools to enhance your data analytics capabilities. We hope you enjoyed this tutorial and found it useful for your projects.
Thank you for choosing Microsoft as your partner in data analytics.
Resources:
Overview of Real-Time Analytics - Microsoft Fabric | Microsoft Learn
Get data from Azure Event Hubs - Microsoft Fabric | Microsoft Learn
Quickstart: Read Azure Event Hubs captured data (Python) - Azure Event Hubs | Microsoft Learn
Visualize data in a Power BI report - Microsoft Fabric | Microsoft Learn
CareerHubPage - Microsoft Fabric Community
Get started with Microsoft Fabric - Training | Microsoft Learn