How to synchronize AWS IoT SiteWise assets and data across AWS accounts

Introduction

As industrial and manufacturing companies embark on their digital transformation journey, they need to leverage advanced technologies for increased efficiency, productivity, quality control, flexibility, cost reduction, supply chain optimization, and competitive advantage in the rapidly evolving digital era. AWS customers in the manufacturing and industrial domain increasingly use AWS IoT SiteWise to modernize their industrial data strategy and unlock the full potential of their operational technology. AWS IoT SiteWise empowers you to efficiently collect, store, organize, and monitor data from industrial equipment at scale. It also lets you derive actionable insights, optimize operations, and drive innovation through data-driven decisions.

The journey often begins with a Proof of Value (PoV) case study in a development environment. This approach gives you an opportunity to explore how data collection and asset modeling with a solution that includes AWS IoT SiteWise can help. As you become comfortable with the solution, you can scale to more assets or facilities in a production environment from staging over time. This blog post provides an overview of the architecture and sample code to migrate the assets and data in AWS IoT SiteWise from one deployment to another, while ensuring data integrity and minimizing operational overhead.

Getting started with AWS IoT SiteWise

During the PoV phase, you establish data ingestion pipelines to stream near real-time sensor data from on-premises data historians, or OPC-UA servers, into AWS IoT SiteWise. You can create asset models that digitally represent your industrial equipment to capture the asset hierarchy and critical metadata within a single facility or across multiple facilities. AWS IoT SiteWise provides API operations that help you import your asset model data (metadata) from diverse systems, such as process historians, into AWS IoT SiteWise in bulk and at scale. Additionally, you can define common industrial key performance indicators (KPIs) using the built-in library of operators and functions available in AWS IoT SiteWise. You can also create custom metrics that are triggered by equipment data on arrival or computed at user-defined intervals.
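
For illustration, the following is a minimal boto3 sketch of such an asset model with one sensor measurement and an hourly average metric. The model name, property names, unit, and expression are placeholders rather than values from this post, and it assumes (per the CreateAssetModel documentation) that a metric variable can reference another property defined in the same payload by its name.

import boto3

sitewise = boto3.client('iotsitewise')

# Hypothetical model: one raw measurement plus an hourly average KPI.
response = sitewise.create_asset_model(
    assetModelName='DemoPumpModel',
    assetModelProperties=[
        {
            'name': 'Temperature',
            'dataType': 'DOUBLE',
            'unit': 'Celsius',
            'type': {'measurement': {}},
        },
        {
            'name': 'AverageTemperature',
            'dataType': 'DOUBLE',
            'unit': 'Celsius',
            'type': {
                'metric': {
                    'expression': 'avg(temp)',
                    # Assumption: a property defined in the same payload can be
                    # referenced by name as the variable's propertyId.
                    'variables': [
                        {'name': 'temp', 'value': {'propertyId': 'Temperature'}}
                    ],
                    'window': {'tumbling': {'interval': '1h'}},
                }
            },
        },
    ],
)
print(response['assetModelId'], response['assetModelStatus']['state'])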

Setting up multiple non-production environments on a factory floor can be challenging due to legacy networking and strict regulations associated with the plant floor, in addition to delays in hardware procurement. Many customers transition the same hardware from non-production to production by designating and certifying the hardware for production use after validation completes.

To accelerate and streamline the deployment process, you need a well-defined approach to migrate your AWS IoT SiteWise resources (assets, hierarchies, metrics, transforms, time series, and metadata) between AWS accounts as part of your standard DevOps practices.

AWS IoT SiteWise stores data across storage tiers that can support training machine learning (ML) models or historical data analysis in production. In this blog post, we provide an overview of how to migrate the asset models, asset hierarchies, and historical time series data from the development environment to the staging and production environments hosted on AWS.

Solution walkthrough

Let's begin by discussing the technical aspects of migrating AWS IoT SiteWise resources and data between AWS accounts. We provide a step-by-step guide on how to export and import asset models and hierarchies using AWS IoT SiteWise APIs. We also discuss how to transfer historical time series data using Amazon Simple Storage Service (Amazon S3) and the AWS IoT SiteWise CreateBulkImportJob API operation.

By following this approach, you can promote your AWS IoT SiteWise setup and data through the development lifecycle as you scale your industrial IoT applications into production. The following is an overview of the process:

  1. AWS IoT SiteWise metadata transfer:
    1. Export AWS IoT SiteWise models and assets from one AWS account (the development account) by running a bulk export job. You can use filters to export only specific models and/or assets.
    2. Import the exported models and/or assets into a second AWS account (the staging account) by running a bulk import job. The import files must follow the AWS IoT SiteWise metadata transfer job schema.
  2. AWS IoT SiteWise telemetry data transfer:
    1. Use the following API operations to migrate telemetry data across accounts:
      1. BatchGetAssetPropertyValueHistory retrieves historical telemetry data from the development account.
      2. CreateBulkImportJob ingests the retrieved telemetry data into the staging account.

The data migration steps in our solution make the following assumptions:

  1. The staging account doesn't already have AWS IoT SiteWise assets or models configured that use the same names or hierarchies as the development account.
  2. You'll replicate the AWS IoT SiteWise metadata from the development account to the staging account.
  3. You'll move the AWS IoT SiteWise telemetry data from the development account to the staging account.

1: Migrate AWS IoT SiteWise models and assets across AWS accounts

Figure 1: Architecture to migrate AWS IoT SiteWise metadata across AWS accounts

AWS IoT SiteWise supports bulk operations with assets and models. The metadata bulk operations help you to:

  1. Export AWS IoT SiteWise models and assets from the development account by running a bulk export job. You can choose what to export when you configure this job. For more information, see Export metadata examples.
    1. Export all assets and asset models, and filter your assets and asset models.
    2. Export assets and filter your assets.
    3. Export asset models and filter your asset models.
  2. Import AWS IoT SiteWise models and assets into the staging account by running a bulk import job. Similar to the export job, you can choose what to import. For more information, see Import metadata examples. A minimal code sketch of both jobs follows this list.
    1. The import files follow a specific format. For more information, see AWS IoT SiteWise metadata transfer job schema.
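
To make these two jobs concrete, the following is a minimal boto3 sketch, assuming the jobs are created with the CreateMetadataTransferJob operation of the AWS IoT TwinMaker API that underpins AWS IoT SiteWise bulk operations. The profile names, S3 bucket, and job IDs are placeholders, and the bucket must be accessible from both accounts.

import boto3

# Assumed: named CLI profiles "dev" and "staging" for the two AWS accounts.
dev = boto3.Session(profile_name='dev').client('iottwinmaker')
staging = boto3.Session(profile_name='staging').client('iottwinmaker')

# Export job in the development account: AWS IoT SiteWise models/assets -> S3.
# An empty filter list exports everything; add filterByAssetModel or
# filterByAsset entries to narrow the export.
export_job = dev.create_metadata_transfer_job(
    metadataTransferJobId='sitewise-export-to-s3',
    sources=[{'type': 'iotsitewise', 'iotSiteWiseConfiguration': {'filters': []}}],
    destination={
        'type': 's3',
        's3Configuration': {'location': 'arn:aws:s3:::my-sitewise-transfer-bucket'},
    },
)
print(export_job['metadataTransferJobId'], export_job['status']['state'])

# Import job in the staging account: S3 -> AWS IoT SiteWise. The files in the
# bucket must follow the AWS IoT SiteWise metadata transfer job schema.
import_job = staging.create_metadata_transfer_job(
    metadataTransferJobId='sitewise-import-from-s3',
    sources=[{
        'type': 's3',
        's3Configuration': {'location': 'arn:aws:s3:::my-sitewise-transfer-bucket'},
    }],
    destination={'type': 'iotsitewise'},
)
print(import_job['metadataTransferJobId'], import_job['status']['state'])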

2: Migrate AWS IoT SiteWise telemetry data across AWS accounts

AWS IoT SiteWise supports ingesting high-volume historical data using the CreateBulkImportJob API operation, which you can use to migrate telemetry data from the development account to the staging account.

Figure 2: Architecture to migrate AWS IoT SiteWise telemetry data across AWS accounts

2.1 Retrieve data from the development account using BatchGetAssetPropertyValueHistory

AWS IoT SiteWise has data and SQL API operations to retrieve telemetry results. You can use the export file from the bulk export job in the previous step to get a list of AWS IoT SiteWise asset IDs and property IDs to query using the BatchGetAssetPropertyValueHistory API operation. The following sample code demonstrates retrieving data for the last two days:

import boto3
import csv
import time
import uuid
"""
Connect to the AWS IoT SiteWise API and define the assets and properties
to retrieve data for.
"""
sitewise = boto3.client('iotsitewise')
# Limit of 10 assetIds/propertyIds/entryIds per API call
asset_ids = ['a1','a2','a3'] 
property_ids = ['b1','b2','b3']

"""
Get the start and end timestamps for the date range of historical data
to retrieve. Currently set to the last 2 days.
"""
# Convert current time to Unix timestamp (seconds since epoch)
end_time = int(time.time())
# Start date 2 days ago
start_time = end_time - 2*24*60*60
"""
Generate a list of entries to retrieve property value history.
Loops through the asset_ids and property_ids lists, zipping them
together to generate a unique entry for each asset-property pair.
Each entry contains a UUID for the entryId, the corresponding
assetId and propertyId, and the start and end timestamps for
the date range of historical data.
"""
entries = []
for asset_id, property_id in zip(asset_ids, property_ids):
  entry = {
    'entryId': str(uuid.uuid4()),
    'assetId': asset_id, 
    'propertyId': property_id,
    'startDate': start_time,
    'endDate': end_time,
    'qualities': [ "GOOD" ],
  }
  entries.append(entry)
"""
Generate an entries dictionary to map entry IDs to the full entry data
for retrieving property values by entry ID.
"""
entries_dict = {entry['entryId']: entry for entry in entries}
"""
The snippet below retrieves asset property value history from AWS IoT SiteWise using the
`batch_get_asset_property_value_history` API call. The retrieved data is then
processed and written to a CSV file named 'values.csv'.
The script handles pagination by using the `nextToken` parameter to fetch
subsequent pages of data. Once all data has been retrieved, the script
exits the loop and closes the CSV file.
"""
token = None
with open('values.csv', 'w', newline='') as f:
  writer = csv.writer(f)
  while True:
    """
    Make the API call, passing entries, and the token if on a subsequent call.
    """
    if not token:
      property_history = sitewise.batch_get_asset_property_value_history(
          entries=entries
      )
    else:
      property_history = sitewise.batch_get_asset_property_value_history(
          entries=entries,
          nextToken=token
      )
    """
    Process success entries, extracting values into a list of dicts.
    """
    for entry in property_history['successEntries']:
        entry_id = entry['entryId']
        asset_id = entries_dict[entry_id]['assetId']
        property_id = entries_dict[entry_id]['propertyId']
        for history_values in entry['assetPropertyValueHistory']:
          value_dict = history_values.get('value')
          values_dict = {
            'ASSET_ID': asset_id,
            'PROPERTY_ID': property_id,
            'DATA_TYPE': str(list(value_dict.keys())[0]).upper().replace("VALUE", ""),
            'TIMESTAMP_SECONDS': history_values['timestamp']['timeInSeconds'],
            'TIMESTAMP_NANO_OFFSET': history_values['timestamp']['offsetInNanos'],
            'QUALITY': 'GOOD',
            'VALUE': value_dict[list(value_dict.keys())[0]],
          }
          writer.writerow(list(values_dict.values()))
    """
    Check for the next token and break when pagination is complete.
    """  
    if 'nextToken' in property_history:
      token = property_history['nextToken']
    else:
      break

2.2 Ingest data into the staging account using CreateBulkImportJob

Use the values.csv file to import data into AWS IoT SiteWise using the CreateBulkImportJob API operation. Define the following parameters when you create an import job with CreateBulkImportJob. For a code sample, see CreateBulkImportJob in the AWS documentation; a minimal boto3 sketch is also shown after the parameter list below.

  1. Replace the adaptive-ingestion-flag with true or false. For this exercise, set the value to true.
    1. By setting the value to true, the bulk import job does the following:
      1. Ingests new data into AWS IoT SiteWise.
      2. Calculates metrics and transforms, and supports notifications for data with a timestamp that is within seven days.
    2. If you were to set the value to false, the bulk import job ingests historical data into AWS IoT SiteWise.
  2. Replace the delete-files-after-import-flag with true to delete the data from the Amazon S3 data bucket after it is ingested into AWS IoT SiteWise warm tier storage. For more information, see Create a bulk import job (AWS CLI).
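
The following minimal boto3 sketch stages the values.csv file from section 2.1 in Amazon S3 and then creates the bulk import job in the staging account. The bucket name, object key, IAM role ARN, and job name are placeholders; the column names match the order written by the export script.

import boto3

# Placeholders: replace the bucket, key, and IAM role ARN with your own.
BUCKET = 'my-sitewise-import-bucket'
DATA_KEY = 'telemetry/values.csv'
JOB_ROLE_ARN = 'arn:aws:iam::111122223333:role/SiteWiseBulkImportRole'

s3 = boto3.client('s3')
sitewise = boto3.client('iotsitewise')

# Stage the CSV produced in step 2.1 in the staging account's S3 bucket.
s3.upload_file('values.csv', BUCKET, DATA_KEY)

job = sitewise.create_bulk_import_job(
    jobName='telemetry-import-from-dev',
    jobRoleArn=JOB_ROLE_ARN,
    files=[{'bucket': BUCKET, 'key': DATA_KEY}],
    errorReportLocation={'bucket': BUCKET, 'prefix': 'bulk-import-errors/'},
    jobConfiguration={
        'fileFormat': {
            'csv': {
                'columnNames': [
                    'ASSET_ID', 'PROPERTY_ID', 'DATA_TYPE', 'TIMESTAMP_SECONDS',
                    'TIMESTAMP_NANO_OFFSET', 'QUALITY', 'VALUE',
                ]
            }
        }
    },
    adaptiveIngestion=True,        # the adaptive-ingestion-flag discussed above
    deleteFilesAfterImport=True,   # the delete-files-after-import-flag
)
print(job['jobId'], job['jobStatus'])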

Clean up

After you validate the results in the staging account, you can delete the data from the development account using the AWS IoT SiteWise DeleteAsset and DeleteAssetModel API operations. Alternatively, you may continue to use the development account for other development and testing activities with the historical data.
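
If you decide to clean up the development account, the following is a minimal sketch of that step. The asset and asset model IDs are placeholders, and it assumes the assets have already been disassociated from any parent assets.

import boto3

sitewise = boto3.client('iotsitewise')

# Placeholder IDs from the development account; assets must be deleted before
# the asset models they are based on.
asset_ids = ['a1', 'a2', 'a3']
asset_model_ids = ['m1']

for asset_id in asset_ids:
    sitewise.delete_asset(assetId=asset_id)
    # Wait until the asset is fully removed before deleting its model.
    sitewise.get_waiter('asset_not_exists').wait(assetId=asset_id)

for asset_model_id in asset_model_ids:
    sitewise.delete_asset_model(assetModelId=asset_model_id)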

Conclusion

In this blog post, we addressed the challenge industrial customers face when scaling their AWS IoT SiteWise deployments. We discussed moving from PoV to production across multiple plants and production lines and how AWS IoT SiteWise addresses these challenges. Migrating metadata (such as asset models and asset/enterprise hierarchies) and historical telemetry data between AWS accounts ensures consistent data context. It also helps you promote industrial IoT assets and data through the development lifecycle. For more details, see Bulk operations with assets and models.

Author biographies

Joyson Neville Lewis

Joyson Neville Lewis is a Sr. IoT Data Architect with AWS Professional Services. Joyson worked as a software/data engineer before diving into the conversational AI and Industrial IoT space. He assists AWS customers in materializing their AI visions using voice assistant/chatbot and IoT solutions.

Anish Kunduru

Anish Kunduru is an IoT Data Architect with AWS Professional Services. Anish leverages his background in stream processing, R&D, and Industrial IoT to help AWS customers scale prototypes to production-ready software.

Ashok Padmanabhan

Ashok Padmanabhan is a Sr. IoT Data Architect with AWS Professional Services. Ashok primarily works with manufacturing and automotive customers to design and build Industry 4.0 solutions.

Avik Ghosh

Avik is a Senior Product Manager on the AWS Industrial IoT team, focusing on the AWS IoT SiteWise service. With over 18 years of experience in technology innovation and product delivery, he specializes in Industrial IoT, MES, Historian, and large-scale Industry 4.0 solutions. Avik contributes to the conceptualization, research, definition, and validation of AWS IoT service offerings.
