The Historian is a series of apps that extract historical data from Metasys into a data store of your choice, in a format that can be easily read by a BI tool such as Power BI or Tableau.
There are three apps included in this source code: QuickExtract, Discovery, and Extractor.
There is also a mechanism in place for you to add your own data store for the historical data; you simply need to implement one interface to do so. Instructions can be found in the Data Storage section.
You may browse the source code to see how a .NET Core application interacts with the API. The following quickstart also assumes that you have the source code locally.
To run the quick extract, follow these instructions:
From the QuickExtract folder, run the command shown below. The extracted data is written to timeseries.csv.
dotnet run --host <server.com> -u <Metasys Username> -p <Metasys Password>
-h, --host Required. Base URL <server.com> of the Metasys Application
-u, --username Required. Username for the Metasys Application
-p, --password Required. Password for the Metasys Application
-s, --service (Default: time) Service to extract data from
The Discovery app processes the network tree from Metasys and determines which API endpoints should be used for the extraction calls.
This app allows you to specify FQRs for a more “focused” extraction. Providing specific FQRs allows the extractor to gather historical data for only the specified objects instead of samples for every object in the tree.
The Discovery app converts FQRs into GUIDs and inserts them, along with the EnumSet information, into SQL Server. The Discovery app can also run independently of the Extractor app. You may want to re-run the Discovery app when the list of FQRs is updated or when a new object is added to the system.
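The --fqrs option described below points the Discovery and Extractor apps at a plain-text file of fully qualified references. The snippet below is purely a hypothetical illustration of such a file, assuming one FQR per line; use the reference format that matches your own site:

Site01:NAE01/Programming.VAV-08.ZN-T
Site01:NAE01/Programming.VAV-08.ZN-SP
Site01:NAE02/Programming.AHU-02.DA-T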
To run the discovery, follow these instructions:
From the Discovery folder, run the command shown below.
dotnet run --host <server.com> -u <Metasys Username> -p <Metasys Password> [--service time[,audit][,alarm]]
[--dest sqlserver] [--dbconnection "<Database connection string>"] [--fqrs "<FQR full file path>"]
-h, --host Required. Base URL <server.com> of the Metasys Application
-u, --username Required. Username for the Metasys Application
-p, --password Required. Password for the Metasys Application
-s, --service (Default: time) Comma-separated list of the services you wish to run ([Time][,Audit][,Alarm]). A minimum of one service is required
-d, --dest (Default: SqlServer) The destination the data should be saved to (Csv | SqlServer)
-x, --dbconnection Connection string required to connect to the desired DB
-f, --fqrs The absolute path to the file containing the fully qualified references
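For example, a hypothetical Discovery run that covers the Time and Alarm services, writes to SQL Server, and is limited to the FQRs in a file could look like the following (the host name, credentials, connection string, and file path are placeholder values):

dotnet run --host adx01.company.com -u apiUser -p apiPassword --service time,alarm
--dest sqlserver --dbconnection "Server=localhost;Database=Historian;Trusted_Connection=True;" --fqrs "C:\Historian\fqrs.txt"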
The Extractor app creates jobs and adds tasks to the queue, then processes the URLs from the task queue. The API endpoints are called and the data is saved to the data store. Run this app when you want to gather a large set of data or write the data to a CSV file. When you enter the number of months or days to pull, keep in mind that data can be pulled only up to 3 days in the past.
dotnet run --host <server.com> -u <Metasys Username> -p <Metasys Password> [--service time[,audit][,alarm]]
[--dest sqlserver] [--dbconnection "<Database connection string>"] [--fqrs "<FQR full file path>"] [-M <number of months back>] [-D <number of days back>]
-h, --host Required. Base URL <server.com> of the Metasys Application
-u, --username Required. Username for the Metasys Application
-p, --password Required. Password for the Metasys Application
-s, --service (Default: time) Comma-separated list of the services you wish to run ([Time][,Audit][,Alarm]). A minimum of one service is required
-d, --dest (Default: SqlServer) The destination the data should be saved to (Csv | SqlServer)
-x, --dbconnection Connection string required to connect to the desired DB
-f, --fqrs The absolute path to the file containing the fully qualified references
-D, --days (Default: 0) The number of days you wish to query
-M, --month (Default: 0) The number of months you wish to query
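For example, a hypothetical Extractor run that pulls the last 3 days of Time Series and Audit data into SQL Server could look like this (host, credentials, and connection string are again placeholder values):

dotnet run --host adx01.company.com -u apiUser -p apiPassword --service time,audit
--dest sqlserver --dbconnection "Server=localhost;Database=Historian;Trusted_Connection=True;" -D 3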
All of the concrete classes for the data stores are implementations of the IDataStore interface. The interface represents saving a single type of historical data (Time Series, Audit, or Alarm) to the data store, which means a separate implementation must be made for each historical data type you want to save into the data store. For example, if you wanted to save the Audit and Time Series data, you would need two implementations of IDataStore: one for the Time Series data and one for the Audit data. This gives you the flexibility to implement only what you need; the rest will be ignored by the code. Each implementation will also need to declare the type for T. The data models in the Models folder should be sufficient for saving the data, but they can be modified if your use case needs more or less data.
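To make this concrete, the following is a minimal sketch of a custom Time Series store, assuming the interface is generic over the model type and exposes a single save method. The member name, its signature, and the TimeSeries model name are illustrative assumptions only; match your class to the actual definitions in the Interfaces and Models folders.

using System.Collections.Generic;
using System.Threading.Tasks;

// Hypothetical sketch: the real IDataStore definition lives in the Interfaces folder,
// and the Time Series model in the Models folder. Match your class to those definitions.
public class MyTimeSeriesDataStore : IDataStore<TimeSeries>
{
    // Assumed member name and signature for illustration; implement whatever the actual interface declares.
    public async Task<bool> SaveDataAsync(IEnumerable<TimeSeries> samples)
    {
        foreach (var sample in samples)
        {
            // Write each sample to your own storage technology here (for example a REST call or a file append).
        }
        return await Task.FromResult(true);
    }
}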
Everything related to data storage is in the HistoricalDataFetcher.DataStorage project. The project is broken up into five folders:
Each of these sections is covered in detail below:
The section that stores the implementation of the IDataStore interface for the Alarms data store.
The section that stores the implementation of the IDataStore interface for the Audits data store.
The section that declares the IDataStore interface, which is the root interface that all of the data stores implement.
The section that stores the implementation of the IDataStore interface for the Time Series data store.
Declares the models that the data stores are expected to use when storing the data.
As described above, depending on your needs you can implement the IDataStore interface for one, two, or all three historical data types. Code elsewhere in the solution will also need to be modified to account for your changes. The following are the steps needed to add a new implementation of IDataStore:
In Controller\Controller.cs, in the SetDataSaveDestination() function, you will see a case statement for DestinationSave.Custom with a TODO: comment. Set the data store variable(s) in that section to your custom implementation.
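As a rough, hypothetical illustration of that step (the field name below is a placeholder, not the actual variable in Controller.cs), the Custom case might end up looking something like this once your implementation is wired in:

case DestinationSave.Custom:
    // TODO section in Controller.cs: assign your own IDataStore implementation(s) here.
    // MyTimeSeriesDataStore is the hypothetical class sketched in the Data Storage section above;
    // _timeSeriesDataStore stands in for whatever field the controller actually uses.
    _timeSeriesDataStore = new MyTimeSeriesDataStore();
    break;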