Zero Mega is a project from Datora Telecom implemented in partnership with Microsoft. This repository contains all solutions and scripts related to the project.
There are three folders:
- Bash Scripts: the scripts responsible for sending data from Linux servers to Azure Event Hub.
- Stream Analytics Scripts: the scripts for processing incoming data in Azure Stream Analytics.
- ZeroMega: the Visual Studio solution with the Zero Mega ASP.NET Web API, containing:
  - GraphConsoleApp: a console application responsible for setting extended properties in Azure AD.
  - SASGenerator: a console application that generates SAS tokens for isolated tests.
  - ZeroMegaAPI: the main project in this solution, an ASP.NET Web API.
Things Expert wanted to create a car tracking solution more efficient than those already on the market. The target market is car rental companies, which usually install car tracking devices to obtain significant discounts from insurance companies. However, existing solutions on the market rely on GPS devices installed in the cars, which brings two drawbacks: high battery consumption and heavy use of the data connection (incurring mobile operator costs). Given this scenario, the project team established four barriers to be overcome:
- Battery consumption
- High data usage
- Scalability to support an initial estimate of 200,000 cars
- Message integrity, meaning that no message sent to the cloud may be lost
To achieve this, the following architecture was proposed for the solution.
Things Expert had already developed a framework encompassing mobile infrastructure and messaging through their network, but it was not connected to the cloud. Once we identified that their server (Signaling Central) could already send HTTP messages to any destination, we started to design the cloud connection part of the solution.
A typical message sent from a car looks like this:
```json
{
    "anum": "21960101935",
    "bnum": "008102980000000180014",
    "cgi": "724030041128843",
    "id_thing": "724180050340589",
    "lat": "-23.5813055556",
    "long": "-46.62425"
}
```
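On the receiving side, a message like this maps naturally onto a small POCO. The sketch below is ours, not code from the repository: class and property names are assumptions, and everything is kept as a string because the device sends string values.

```csharp
// Sketch: a POCO mapping for the car message above. The JsonPropertyName
// attributes mirror the JSON keys exactly; class/property names are ours.
using System.Text.Json;
using System.Text.Json.Serialization;

public class CarMessage
{
    [JsonPropertyName("anum")]     public string ANum { get; set; }
    [JsonPropertyName("bnum")]     public string BNum { get; set; }
    [JsonPropertyName("cgi")]      public string Cgi { get; set; }
    [JsonPropertyName("id_thing")] public string IdThing { get; set; }
    [JsonPropertyName("lat")]      public string Lat { get; set; }
    [JsonPropertyName("long")]     public string Long { get; set; }

    public static CarMessage Parse(string json) =>
        JsonSerializer.Deserialize<CarMessage>(json);
}
```

For the sample message above, `CarMessage.Parse(json).IdThing` yields the 15-digit identifier `"724180050340589"`.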
The final architecture for the insertion part is shown below:
For the data query part, the final structure is the following. We discuss it throughout this article.
- Create a Web Application in Azure AD and name it ZeroGraph.
- Get the Tenant Name, Tenant ID, Client ID, and Client Secret.
- Open the GraphConsoleApp project and insert your values in the Constants.cs file:
```csharp
public const string TenantName = "<Your Tenant Name>.onmicrosoft.com";
public const string TenantId = "<Your Tenant ID>";
public const string ClientId = "<Your Client ID>";
public const string ClientSecret = "<Your Client Secret>";
public const string AuthString = "https://login.windows.net/" + TenantName;
public const string ResourceUrl = "https://graph.windows.net";
```
- Run the project with F5.
- Enter the desired user accounts and set their extended property values.
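Under the hood, these constants feed an OAuth2 client-credentials token request against Azure AD so the console app can call the Graph API. The sketch below shows the raw request those constants imply (the Azure AD v1 token endpoint); the actual GraphConsoleApp may use a client library such as ADAL instead, and the method and variable names here are ours.

```csharp
// Sketch only: building the OAuth2 client-credentials request that the
// Constants.cs values feed. Sending it (HttpClient.PostAsync) is omitted.
using System.Collections.Generic;
using System.Net.Http;

public static class TokenRequest
{
    public static (string Url, FormUrlEncodedContent Body) Build(
        string tenantName, string clientId, string clientSecret)
    {
        // AuthString + "/oauth2/token" is the v1 token endpoint for the tenant.
        var url = $"https://login.windows.net/{tenantName}/oauth2/token";
        var body = new FormUrlEncodedContent(new Dictionary<string, string>
        {
            ["grant_type"]    = "client_credentials",
            ["client_id"]     = clientId,
            ["client_secret"] = clientSecret,
            ["resource"]      = "https://graph.windows.net", // ResourceUrl above
        });
        return (url, body);
    }
}
```

Posting this body to the URL with `HttpClient` returns a JSON payload whose `access_token` field authenticates subsequent Graph calls.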
- Create a Service Bus Namespace
- Create an Event Hub
- Create a General Purpose Storage Account
- Use the Microsoft Azure Storage Emulator and create:
- BLOB Container called idthingsrd
- Tables xDRTable and xDR2LogsTable
- Upload the file Stream Analytics Scripts\2016-08-09-03-22-ref.csv into the idthingsrd container.
- Create a Stream Analytics job and configure it as follows:
- Input 1: Event Hub
- Input 2: Reference data idthingsrd
- Output 1: Table xDRTable
- Output 2: Table xDR2LogsTable
- Query: copy and paste the following content (also available in [Stream Analytics Scripts\query.txt](/Stream Analytics Scripts/query.txt)):
```sql
--Processing
WITH ProcessedInput AS (
    SELECT
        CASE
            WHEN LEN(id_thing) = 17 THEN CONCAT('90', id_thing)
            WHEN LEN(id_thing) = 15 THEN CONCAT('8000', id_thing)
            ELSE CONCAT('X', id_thing)
        END AS id_thing,
        System.TimeStamp AS datetime_event,
        lat AS latitude,
        long AS longitude,
        dts AS date_event,
        tts AS time_event,
        anum AS numA,
        bnum AS numB,
        cgi AS CGI
    FROM
        Labcom01in
)

--Output: xDR
SELECT
    PI.id_thing, PI.datetime_event, Ref.account, PI.latitude, PI.longitude,
    PI.date_event, PI.time_event, PI.numA, PI.numB, PI.CGI
INTO
    xDRout
FROM
    ProcessedInput PI
JOIN
    Idthingrdin Ref
ON
    PI.id_thing = Ref.id_thing
```
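The query joins against the reference blob on `id_thing` and reads `Ref.account`, so the CSV uploaded to the idthingsrd container must contain at least those two columns, with `id_thing` already in the normalized 19-digit form. An illustrative excerpt (the values below are hypothetical, not taken from the actual reference file):

```csv
id_thing,account
8000724180050340589,1001
9035209900176148123,1002
```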
IMEI, IMSI, NumberA, NumberB, and CGI are typically values extracted from a GSM call.
Typically, IMEI numbers have 17 digits, while IMSI numbers have 15. The partner responsible for the design of the solution decided to define a standard 19-digit unique identifier, using the prefix 90 for an incoming IMEI or 8000 for an IMSI. In addition, for logging purposes, if the value is neither an IMEI nor an IMSI, an X prefix is added.
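The same normalization rule can be sketched in C# for reference (the Stream Analytics query is the authoritative implementation; the class and method names below are ours):

```csharp
// Sketch of the id_thing normalization rule:
// 17 digits => IMEI, prefixed with "90"; 15 digits => IMSI, prefixed with
// "8000"; anything else gets "X" so it can be spotted in the logs.
public static class IdNormalizer
{
    public static string Normalize(string idThing) => idThing.Length switch
    {
        17 => "90" + idThing,
        15 => "8000" + idThing,
        _  => "X" + idThing,
    };
}
```

For example, the sample message's 15-digit id `"724180050340589"` becomes the 19-character key `"8000724180050340589"`.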
Your Stream Analytics topology should be similar to this one:
- Create a Web Application in Azure AD and name it ZeroAPI.
- Grant the application permission to read the directory.
- Get the Tenant Name, APP ID URI, Client ID, and Client Secret.
- Create a general purpose storage account in Azure and get its connection string.
- Open the ZeroMega solution and the ZeroMegaAPI project. In the Web.config file, replace the following values:
```xml
<appSettings>
  <add key="ClientValidationEnabled" value="true" />
  <add key="UnobtrusiveJavaScriptEnabled" value="true" />
  <!-- Main App Settings: -->
  <add key="ida:Tenant" value="<Your Tenant Name>.onmicrosoft.com" />
  <add key="ida:Audience" value="<Your APP ID URI>" />
  <add key="ida:ClientID" value="<Your Client ID>" />
  <add key="ida:Password" value="<Your Client Secret>" />
  <add key="StorageConnectionString" value="<Your Storage Connection String>" />
</appSettings>
```
- Copy the ZeroGraph constants previously set up into the Constants.cs file of the ZeroMegaAPI project.
We decided to use a constants file instead of the Web.config file in order to separate the Active Directory querying role from the Web API role.
- Run the project with F5 and check that everything works.
- Publish the Web API as an Azure Web App and expose it to your web services/partners.
The main goal of the project is to offer the API commercially, directly to developers. To achieve that, we decided to use Azure API Management, which provides an easy interface for usage control and analytics. This step is optional if you do not want to add this commercial/control layer.
Basically, you need to:
- Create an API Management environment
- Use your Azure AD tenant as an OAuth authorization server
- Add a reference to the Azure API App using the Swagger doc file
Steve Danielson has written an excellent article showing how to protect a Web API backend with Azure Active Directory and API Management. You can find it here: https://azure.microsoft.com/en-us/documentation/articles/api-management-howto-protect-backend-with-aad/
As you can see in the IPosition interface, you can query car (or other thing) positions by ThingId, AccountId, and DateTime:
```csharp
interface IPosition
{
    Task<ThingPosition> GetThingPosition(int accountId, string thingId, string datetime);
    Task<IEnumerable<ThingPosition>> GetThingPositions(int accountId, string thingId);
    Task<IEnumerable<ThingPosition>> GetThingPositions(int accountId, string thingId, DateTime lowerLimit);
    Task<IEnumerable<ThingPosition>> GetThingPositions(int accountId, string thingId, DateTime lowerLimit, DateTime upperLimit);
    Task<IEnumerable<ThingPosition>> GetAllThingsPositions(int accountId);
}
```
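To illustrate how the time-window overloads behave, here is an in-memory sketch of one of them. `ThingPosition` below is a minimal hypothetical stub (the real class lives in ZeroMegaAPI and has its own fields), and the inclusive lower-limit semantics are an assumption on our part:

```csharp
// Sketch: an in-memory stand-in for one GetThingPositions overload.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

public class ThingPosition            // hypothetical stub, not the real class
{
    public string ThingId { get; set; }
    public DateTime DatetimeEvent { get; set; }
}

public class InMemoryPositions
{
    private readonly List<ThingPosition> _data;
    public InMemoryPositions(List<ThingPosition> data) => _data = data;

    // Mirrors GetThingPositions(accountId, thingId, lowerLimit); accountId is
    // ignored here because this fake holds a single account's data.
    public Task<IEnumerable<ThingPosition>> GetThingPositions(
        int accountId, string thingId, DateTime lowerLimit) =>
        Task.FromResult(_data.Where(p => p.ThingId == thingId
                                      && p.DatetimeEvent >= lowerLimit));
}
```

A caller passing `lowerLimit` would thus receive only the positions of that thing recorded at or after the given instant.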
Although AccountID appears as a parameter (identifying the car's owner), a user cannot query cars that do not belong to him/her. Since we protected the API with Azure AD and extended its schema to support the AccountID property, we can query AD for the authenticated user's AccountID and return only his/her cars.
```csharp
public class PositionController : ApiController
{
    private int _account;
    private PositionRepository _repository = new PositionRepository();

    // api/Position/
    [Route("api/Position")]
    public async Task<IEnumerable<ThingPosition>> Get()
    {
        var user = getUPN();
        _account = GetAccountId(user);
        return await _repository.GetAllThingsPositions(_account);
    }

    // [...]
}
```