<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: philbasford</title>
    <description>The latest articles on Forem by philbasford (@philbasford).</description>
    <link>https://forem.com/philbasford</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F500436%2F924e5570-1996-4b73-9337-5898b5209dd0.png</url>
      <title>Forem: philbasford</title>
      <link>https://forem.com/philbasford</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/philbasford"/>
    <language>en</language>
    <item>
<title>AWS re:Invent 2022 wish list</title>
      <dc:creator>philbasford</dc:creator>
      <pubDate>Wed, 23 Nov 2022 21:59:56 +0000</pubDate>
      <link>https://forem.com/aws-builders/aws-reinvent-2022-wish-list-bff</link>
      <guid>https://forem.com/aws-builders/aws-reinvent-2022-wish-list-bff</guid>
      <description>&lt;p&gt;Just before it is wheels up and hopping over the Atlantic Ocean to the US for AWS re:invent. I wanted to do a quick blog containing my wish list. So here it is:&lt;/p&gt;

&lt;h1&gt;
  
  
  More features in SageMaker Serverless
&lt;/h1&gt;

&lt;p&gt;SageMaker Serverless went GA in April 2022, having been announced at last year's re:Invent. However, it still does not support the following:&lt;/p&gt;

&lt;p&gt;&lt;em&gt;GPUs, AWS marketplace model packages, private Docker registries, Multi-Model Endpoints, VPC configuration, network isolation, data capture, multiple production variants, Model Monitor, and inference pipelines.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;For me these are pretty much showstoppers in any situation where I would want to use SageMaker Serverless, so I revert back to Lambda and do things the long way. OK, of course I know GPUs are still an issue with Lambda too, but please AWS, can we have these features in SageMaker Serverless.&lt;/p&gt;

&lt;h1&gt;
  
  
  Managed MLflow
&lt;/h1&gt;

&lt;p&gt;I am seeing growing demand for MLflow (&lt;a href="https://mlflow.org/" rel="noopener noreferrer"&gt;https://mlflow.org/&lt;/a&gt;), and a lot of people looking at Databricks as a commercial offering for MLflow. Alternatively, some people are implementing something like &lt;a href="//file:///Users/phil@inawisdom.com/Desktop/Managing%20your%20Machine%20Learning%20lifecycle%20with%20MLflow"&gt;Managing your Machine Learning lifecycle with MLflow&lt;/a&gt;. I think this was on my wish list last year too, but I really hope AWS announce a managed MLflow service. I know version 2.x is too new, but at least 1.x would be a great start.&lt;/p&gt;

&lt;h1&gt;
  
  
  Public CodeCommit
&lt;/h1&gt;

&lt;p&gt;I use CodeCommit every day and it offers basic Git functionality for private repositories. However, if I wish to share anything with the public, I need to use my GitHub account as a second remote and remember to push to both. It would be awesome for AWS to offer public repositories that I can push my open-source projects to.&lt;/p&gt;
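&lt;p&gt;In the meantime, Git itself can paper over this: one remote can be given two push URLs, so a single push updates both CodeCommit and GitHub. A minimal sketch, where both repository URLs are placeholders for your own:&lt;/p&gt;

```shell
# Workaround sketch: a scratch repo with one "origin" that pushes to two remotes.
# Both URLs are placeholders for your real CodeCommit and GitHub repositories.
cd "$(mktemp -d)"
git init -q my-repo
cd my-repo
git remote add origin https://git-codecommit.eu-west-1.amazonaws.com/v1/repos/my-repo
git remote set-url --add --push origin https://git-codecommit.eu-west-1.amazonaws.com/v1/repos/my-repo
git remote set-url --add --push origin git@github.com:example/my-repo.git
git remote get-url --push --all origin   # lists both push URLs
```

&lt;p&gt;After this, a single &lt;code&gt;git push origin main&lt;/code&gt; updates both remotes, while fetches still come from CodeCommit only.&lt;/p&gt;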

&lt;h1&gt;
  
  
  3rd-party integration in Step Functions
&lt;/h1&gt;

&lt;p&gt;This year I have got into Apache Airflow, and I wrote a blog, &lt;a href="//file:///Users/phil@inawisdom.com/Desktop/Managing%20your%20Machine%20Learning%20lifecycle%20with%20MLflow"&gt;Running Thousands of Models a Month with Apache Airflow&lt;/a&gt;. For me the main reason you would choose Apache Airflow over Step Functions is the type of integrations you require. If you are mainly using lots of open-source or commercial offerings, either SaaS or PaaS, then you are best using Apache Airflow. However, if you are going mainly AWS cloud native, then you would use Step Functions, maybe with the odd Lambda for any non-AWS integrations. It would be awesome for AWS to offer a way, like the boto3 SDK does for AWS services, to use 3rd-party SDKs from Step Functions.&lt;/p&gt;
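&lt;p&gt;To illustrate that "odd Lambda" pattern: in Amazon States Language today, every non-AWS call needs a Lambda shim, as in the sketch below (the function name and ARN are made up):&lt;/p&gt;

```json
{
  "Comment": "Hypothetical: calling a 3rd-party SaaS from Step Functions via a Lambda shim",
  "StartAt": "CallThirdPartySdk",
  "States": {
    "CallThirdPartySdk": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:eu-west-1:123456789012:function:third-party-shim",
      "End": true
    }
  }
}
```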

&lt;h1&gt;
  
  
  Step Functions Local Dev
&lt;/h1&gt;

&lt;p&gt;I know I am not the only one that finds this tedious, but testing Step Functions locally with local integrations is hard work. It is also not easy to integration-test a complete serverless app. If you really do have time, you can build something with Testcontainers and Docker Compose to spin up the various AWS local offerings:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;a href="https://hub.docker.com/r/amazon/aws-stepfunctions-local" rel="noopener noreferrer"&gt;https://hub.docker.com/r/amazon/aws-stepfunctions-local&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;a href="https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-cli-command-reference-sam-local-start-lambda.html" rel="noopener noreferrer"&gt;https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-cli-command-reference-sam-local-start-lambda.html&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;a href="https://github.com/awslabs/amazon-ecs-local-container-endpoints" rel="noopener noreferrer"&gt;https://github.com/awslabs/amazon-ecs-local-container-endpoints&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;a href="https://hub.docker.com/r/amazon/dynamodb-local" rel="noopener noreferrer"&gt;https://hub.docker.com/r/amazon/dynamodb-local&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;
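&lt;p&gt;As a sketch of how the images above could hang together in a Compose file (service names, ports, and layout are illustrative, not a blessed setup):&lt;/p&gt;

```yaml
# Sketch: a local serverless test stack built from the images listed above.
version: "3.8"
services:
  stepfunctions-local:
    image: amazon/aws-stepfunctions-local
    ports:
      - "8083:8083"
    environment:
      # Point Step Functions Local at a SAM local Lambda endpoint on the host
      - LAMBDA_ENDPOINT=http://host.docker.internal:3001
  dynamodb-local:
    image: amazon/dynamodb-local
    ports:
      - "8000:8000"
```

&lt;p&gt;You would run &lt;code&gt;sam local start-lambda --port 3001&lt;/code&gt; alongside this so Step Functions Local can invoke your functions.&lt;/p&gt;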

&lt;p&gt;Please AWS, I would love a complete local development environment for the main serverless services.&lt;/p&gt;

&lt;h1&gt;
  
  
  Cross-account Parameter Store replication
&lt;/h1&gt;

&lt;p&gt;Lastly, my final wish is replication of parameters in Parameter Store between accounts. I have recently built some MLOps pipelines where the S3 buckets and DynamoDB tables exist in one account but are used by other accounts. I want the names of the buckets and tables, which are created dynamically by CloudFormation, to be shared via Parameter Store with the other accounts. This would mean CloudFormation templates in the other accounts could just reference the parameter.&lt;/p&gt;
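&lt;p&gt;Until AWS builds this natively, a DIY version is possible: an EventBridge rule on Parameter Store change events invoking a Lambda that copies the parameter into the target account via an assumed role. This is purely a sketch of my own; the role ARN, prefix, and naming convention are all hypothetical:&lt;/p&gt;

```python
# Hypothetical DIY replication until AWS offers it natively: an EventBridge rule
# on "Parameter Store Change" events invokes this Lambda, which copies the
# parameter into a target account via an assumed role. Names are illustrative.

def replica_name(name: str, prefix: str = "/shared") -> str:
    """Namespace replicated parameters so they cannot clash with local ones."""
    return f"{prefix}{name}" if name.startswith("/") else f"{prefix}/{name}"

def handler(event: dict, target_role_arn: str) -> str:
    import boto3  # imported here so the pure helper above has no AWS dependency

    name = event["detail"]["name"]
    value = boto3.client("ssm").get_parameter(Name=name)["Parameter"]["Value"]
    creds = boto3.client("sts").assume_role(
        RoleArn=target_role_arn, RoleSessionName="param-replication"
    )["Credentials"]
    target_ssm = boto3.client(
        "ssm",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )
    target_name = replica_name(name)
    target_ssm.put_parameter(Name=target_name, Value=value,
                             Type="String", Overwrite=True)
    return target_name
```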

&lt;p&gt;I hope you like the list. If you want to meet at re:Invent, contact me via PeerTalk.&lt;/p&gt;

</description>
      <category>beginners</category>
      <category>learning</category>
    </item>
    <item>
      <title>re:Invent 2022 Sessions</title>
      <dc:creator>philbasford</dc:creator>
      <pubDate>Tue, 11 Oct 2022 09:39:55 +0000</pubDate>
      <link>https://forem.com/philbasford/reinvent-2022-sessions-57no</link>
      <guid>https://forem.com/philbasford/reinvent-2022-sessions-57no</guid>
      <description>&lt;p&gt;This year I will be heading back to Las Vegas for re:Invent 2022. My third re:Invent and for first time post Pandemic. I have looked over the session catalog and here my picks for 2022.&lt;/p&gt;

&lt;h1&gt;
  
  
  Community
&lt;/h1&gt;

&lt;p&gt;The following are some sessions that fellow AWS Ambassadors, Heroes, and Community Builders are presenting at:&lt;/p&gt;

&lt;h3&gt;
  
  
  Matt Houghton: Migrating 600 databases to AWS and making them better, faster, and cheaper
&lt;/h3&gt;

&lt;p&gt;COM311&lt;br&gt;
This dev chat covers the entire journey of a real-life large-scale database migration from Oracle on premises to PostgreSQL on AWS. Learn about the process and unique details, issues, and tips learned from zero-downtime database migration, change data capture, and schema conversion. Find out how the migration not only made the databases cheaper by eliminating licensing costs, but also sped up development and helped the application run faster. Finally, find out about the missing features the migration team needed for PostgreSQL and how they open-sourced them.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt; Thursday, December 1&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 11:00 AM - 11:30 AM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 300 - Advanced&lt;br&gt;
&lt;strong&gt;Session type:&lt;/strong&gt; Dev Chat&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Hall B, Expo, Developer Lounge, Booth #2818&lt;/p&gt;

&lt;h3&gt;
  
  
  Rolf Koski: Building a sustainable practice for tomorrow
&lt;/h3&gt;

&lt;p&gt;PEX201&lt;br&gt;
Do you want to have a positive impact on environmental sustainability when you advance your practices? Attend this session to learn how to reimagine your sustainability posture, innovate new lean solutions, and optimize your cloud workloads with a positive impact on the environment. Learn about user patterns, software designs, and AWS service considerations that organizations of any size can apply to their workloads to transform businesses into a sustainable future. This session is intended for AWS Partners.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt; Monday, November 28&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 4:45 PM - 5:45 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 200 - Intermediate&lt;br&gt;
&lt;strong&gt;Session type:&lt;/strong&gt; Breakout Session&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Level 1, Encore Ballroom 2, Encore&lt;/p&gt;

&lt;h3&gt;
  
  
  Jon Topper: Build or transform to an AWS Well-Architected SaaS using AWS Control Tower
&lt;/h3&gt;

&lt;p&gt;MKT306&lt;br&gt;
Building AWS Well-Architected infrastructure is fundamental to building SaaS products that meet demanding compliance requirements around security, performance, and reliability. With AWS Marketplace professional services offerings, you can achieve compliance certification quickly and reduce per-tenant onboarding time. Join this chalk talk to learn best practices and approaches from experts.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt; Monday, November 28&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 12:15 PM - 1:15 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 300 - Advanced&lt;br&gt;
&lt;strong&gt;Session type:&lt;/strong&gt;  Chalk Talk&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Level 1, Montrachet 1, Wynn&lt;/p&gt;

&lt;h3&gt;
  
  
  Jimmy Dahlqvist: An IoT-enabled smoker for great BBQ
&lt;/h3&gt;

&lt;p&gt;COM203&lt;br&gt;
Maintaining a good temperature is important if you want to make really good BBQ. In this dev chat, take a look at how to create an IoT-enabled BBQ smoker using Raspberry Pi and AWS IoT Greengrass. Dive deep into the AWS services you need to create a serverless, event-driven system, using services like AWS IoT Core, AWS Lambda, and Amazon EventBridge that can be used to monitor and notify you when a temperature falls outside of a specified range.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt; Tuesday, November 29&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 1:00 PM - 1:30 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 200 - Intermediate&lt;br&gt;
&lt;strong&gt;Session type:&lt;/strong&gt; Dev Chat&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Venetian&lt;/p&gt;

&lt;h3&gt;
  
  
  Martijn van Dongen: Terraform providers using AWS CloudFormation custom resources
&lt;/h3&gt;

&lt;p&gt;BOA316&lt;br&gt;
Terraform, AWS CloudFormation, and AWS CDK support a broad set of AWS services. Often, users want to automate more and even create resources that are not supported out of the box. In this session, explore how to build CloudFormation custom resources using your chosen programming language or Docker container. This method has been successfully used with CloudFormation and CDK, and it also works well with Terraform. Using Terraform modules, you only have to add a few lines of code to your projects to use a custom provider. Explore how this solution is designed to be secure, simple, future-proof, and reliable.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt; Tuesday, November 29&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 12:30 PM - 1:30 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 300 - Advanced&lt;br&gt;
&lt;strong&gt;Session type:&lt;/strong&gt; Breakout Session&lt;/p&gt;

&lt;h1&gt;
  
  
  AI &amp;amp; ML
&lt;/h1&gt;

&lt;h3&gt;
  
  
  New approaches to implementing quantum algorithms
&lt;/h3&gt;

&lt;p&gt;QTC403&lt;br&gt;
While it is becoming easier to implement popular quantum algorithms on hardware through services like Amazon Braket, it can be challenging to generalize these algorithms, prepare them to run at scale, and code them in a way that is both efficient and hardware-aware. Many organizations have begun investigating quantum computing but have discovered that moving forward with developing algorithms that are production-ready can seem like an insurmountable challenge. In this chalk talk, join AWS and Classiq, an AWS Partner and quantum software company, to walk through a customer journey for a financial use case based on Amazon Braket.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Monday, November 28&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt;  12:15 PM - 1:15 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Chalk Talk&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; MGM Grand&lt;/p&gt;

&lt;h3&gt;
  
  
  Extract AI-driven customer insights using Post-Call Analytics
&lt;/h3&gt;

&lt;p&gt;AIM402&lt;br&gt;
Companies are transforming existing contact centers by adding AI/ML to deliver actionable insights and improve automation with existing telephony systems. Join this workshop to learn how to use the AWS Contact Center Intelligence (CCI) Post-Call Analytics solution to derive AI-driven insights from virtually all customer conversations. Explore how you can automatically generate insights like sentiment, call issues, agent actions, and outcomes, as well as detect entities. See how you can increase the accuracy of the transcriptions using the custom vocabulary features of Amazon Transcribe. Finally, move on to building analytics dashboards on Amazon QuickSight to report the metrics that are important to your business. You must bring your laptop to participate.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Monday, November 28&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 4:00 PM - 6:00 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Workshop&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Mandalay Bay&lt;/p&gt;

&lt;h3&gt;
  
  
  Train and deploy large language models on Amazon SageMaker
&lt;/h3&gt;

&lt;p&gt;AIM405&lt;br&gt;
Join this chalk talk to dive deep into training and hosting large language models for high accuracy and low cost using Hugging Face Transformers on Amazon SageMaker. Also, hear how Mantium achieved high-throughput training and low-latency inference for large transformer models using SageMaker distributed training libraries and DeepSpeed on SageMaker.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Tuesday, November 29&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 11:45 AM - 12:45 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Chalk Talk&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Caesars Forum&lt;/p&gt;

&lt;h3&gt;
  
  
  From cutting-edge ML research to product with the Amazon Research Awards
&lt;/h3&gt;

&lt;p&gt;AIM408&lt;br&gt;
The Amazon Research Awards (ARA) offers unrestricted funds and AWS promotional credits to support research at academic institutions and nonprofit organizations in areas that align with the mission to advance customer-obsessed science. Join this session to learn about current state-of-the-art research from ARA awardees and how you can take advantage of these cutting-edge ML innovations using AWS services.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Tuesday, November 29&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 12:30 PM - 1:30 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Breakout Session&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Mandalay Bay&lt;/p&gt;

&lt;h3&gt;
  
  
  Choosing the right ML instance for training and inference on AWS
&lt;/h3&gt;

&lt;p&gt;AIM407&lt;br&gt;
As a data scientist, choosing the right compute instance for your workload on AWS can be challenging. On AWS, you can choose from CPUs, GPUs, AWS Trainium, and Intel Habana Gaudi to accelerate training. For inference, you can choose from CPUs, GPUs, and AWS Inferentia. This chalk talk guides you through how to choose the right compute instance type on AWS for your deep learning projects. Explore the available options, such as the most performant instance for training, the best instance for prototyping, and the most cost-effective instance for inference deployments. Learn how to use Amazon SageMaker Inference Recommender to help you make the right decision for your inference workloads.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Wednesday, November 30&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 11:30 AM - 12:30 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Chalk Talk&lt;/p&gt;

&lt;h3&gt;
  
  
  Maximize margins with AI/ML dynamic pricing models
&lt;/h3&gt;

&lt;p&gt;TRV401&lt;br&gt;
Many airlines want to adapt pricing in real time to match market conditions, making pricing models more complex. In this chalk talk, learn how you can use machine learning to drive increased margins with the appropriate risk mitigation and customer experience controls. Using Amazon SageMaker, Amazon Athena, AWS Lambda, and Amazon Kinesis, walk through data feature engineering considerations, example model approaches for demand forecasting, and margin adjustment techniques using search and booking data.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Wednesday, November 30&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 12:15 PM - 1:15 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Chalk Talk&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Mandalay Bay&lt;/p&gt;

&lt;h3&gt;
  
  
  Prepare data and model features for ML with ease, speed, and accuracy
&lt;/h3&gt;

&lt;p&gt;AIM326&lt;br&gt;
Join this chalk talk to learn how to prepare data for ML in minutes using Amazon SageMaker. SageMaker offers tools to simplify data preparation so that you can label, prepare, and understand your data and engineer model features. Walk through a complete data preparation workflow, and learn how to extract data from multiple data sources and transform it using the prebuilt visualization templates in SageMaker Data Wrangler. Then, discover how to store, update, retrieve, and share model features using SageMaker Feature Store for usage in training and inference.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Friday, December 2&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 10:45 AM - 11:45 AM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 300 - Advanced&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Chalk Talk&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Caesars Forum&lt;/p&gt;

&lt;h3&gt;
  
  
  Explainable attention-based NLP using perturbation methods
&lt;/h3&gt;

&lt;p&gt;BOA401&lt;br&gt;
Explainable AI has gained a lot of attention from both legislators and the scientific community. There are many advantages of being able to explain the reasoning behind the decisions a model makes, top among them are fairness, accountability, and causality. More and more, explainability is used to improve both human and machine decision-making in a mutually reinforcing loop. This session specifically focuses on post hoc local explainability for transformer-based NLP. This session first details the general and scientific methods for explaining BERT and then explores challenges for a real-world implementation.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Monday, November 28&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 12:15 PM - 1:15 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Breakout Session&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; MGM Grand&lt;/p&gt;

&lt;h3&gt;
  
  
  Build human-like customer experiences with conversational AI
&lt;/h3&gt;

&lt;p&gt;AIM403&lt;br&gt;
Today’s contact centers provide omnichannel services so that customers can interact by chat or phone. With shorter hold times, chat has helped solve some of the challenges associated with traditional voice-based contact centers, but it is still constrained by the need for live agents. While interactive voice response (IVR) systems help with some automation, their deterministic behavior frustrates many customers. Contact centers can solve for this using conversational AI. In this workshop, learn how you can use powerful conversational AI with Amazon Lex, Amazon Kendra, and the Amazon Chime SDK to build fully automated human-like chat experiences and alleviate challenges with IVR systems. You must bring your laptop to participate.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Tuesday, November 29&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 11:45 AM - 1:45 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Workshop&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Mandalay Bay&lt;/p&gt;

&lt;h3&gt;
  
  
  Exploring applications of QNNs and Monte Carlo simulations
&lt;/h3&gt;

&lt;p&gt;QTC402&lt;br&gt;
This chalk talk provides a close look at several cutting-edge applications for future quantum computers. First, learn how AWS Partner QC Ware and healthcare industry leader Roche are using Amazon Braket to build more interpretable and well-behaved quantum neural networks (QNNs) for medical image recognition. Then, discover how QC Ware and financial services organizations are building solutions with Amazon Braket to speed up financial Monte Carlo simulations.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Tuesday, November 29&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 4:15 PM - 5:15 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Chalk Talk&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Caesars Forum&lt;/p&gt;

&lt;h3&gt;
  
  
  Scaling ML training and inference workloads on Amazon EC2 and PyTorch
&lt;/h3&gt;

&lt;p&gt;CMP401&lt;br&gt;
Attend this chalk talk with Meta PyTorch and AWS to learn about distributed training, accelerated inference, bin-packing models, and scaling with PyTorch on AWS. In addition, hear about relevant large companies that use AWS infrastructure and services for distributed training and to serve thousands of models with low latency and cost.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Wednesday, November 30&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 1:45 PM - 2:45 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Chalk Talk&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Mandalay Bay&lt;/p&gt;

&lt;h3&gt;
  
  
  Train and host foundation models with PyTorch on AWS
&lt;/h3&gt;

&lt;p&gt;AIM404&lt;br&gt;
Foundation models with hundreds of billions of parameters activate new applications for machine learning. Models such as OPT-175B, BLOOM, Jurassic, GPT-3, and DALL-E demonstrate exciting new use cases for text and image generation, but training and serving these models for inference poses new challenges. In this session, learn how to train and serve foundation models at scale using PyTorch on AWS. Discover how to reduce training time and cost by optimizing compute, network communication, input/output, checkpointing, and offloading from GPUs to CPUs. Also learn how to pick the right model parallelization strategies and best practices for training and serving these models.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Wednesday, November 30&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 7:00 PM - 8:00 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Breakout Session&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Venetian&lt;/p&gt;

&lt;h3&gt;
  
  
  Elastic Fabric Adapter advanced topics for AI/ML and HPC
&lt;/h3&gt;

&lt;p&gt;CMP407&lt;br&gt;
Elastic Fabric Adapter (EFA), built on the Scalable Reliable Datagram (SRD) protocol, is the foundation for scaling message passing interface (MPI) and NVIDIA Collective Communications Library–based high performance computing (HPC) and machine learning (ML) codes at AWS. This builders’ session addresses application scaling challenges and is led by an AWS EFA team principal engineer. It includes a roundtable-style discussion, so come prepared to share your thoughts and questions. You must bring your laptop to participate.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Wednesday, November 30&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 7:45 PM - 8:45 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Builders' Session&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Wynn&lt;/p&gt;

&lt;h3&gt;
  
  
  Using AutoML to develop deep learning solutions automatically
&lt;/h3&gt;

&lt;p&gt;BOA402&lt;br&gt;
Many companies, startups and enterprises alike, do not have the teams and processes needed to create their own research teams. The promise of AutoML is that it allows companies and scientists to create deep learning solutions without possessing a deep knowledge of deep learning. This workshop focuses on describing AutoML methods and walks you through a practical example of creating a solution on a public dataset using AutoGluon, an open-source AutoML framework developed by AWS. Finally, look into Amazon SageMaker Autopilot, which is developed using AutoGluon. You must bring your laptop to participate.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Thursday, December 1&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 11:00 AM - 1:00 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Workshop&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Mandalay Bay&lt;/p&gt;

&lt;h3&gt;
  
  
  Deploy deep learning models with hyperscale performance on SageMaker
&lt;/h3&gt;

&lt;p&gt;AIM401&lt;br&gt;
In this workshop, explore the hardware and software optimizations available for deep learning model deployment, when and how to use them, and the impact on the model performance and cost. Learn how to run thousands of deep learning models using Amazon SageMaker multi-model endpoints and reduce model serving costs. Learn how to use the AWS Neuron SDK to compile and deploy models on AWS Inferentia chips. Lastly, dive deep into model compilation and optimization techniques using TensorRT and NVIDIA Triton Inference Server features to achieve hyperscale inference on Amazon SageMaker. You must bring your laptop to participate.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Friday, December 2&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 8:30 AM - 10:30 AM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Workshop&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Venetian&lt;/p&gt;

&lt;h1&gt;
  
  
  Architecture
&lt;/h1&gt;

&lt;h3&gt;
  
  
  Achieving extreme resiliency through recovery-oriented architectures
&lt;/h3&gt;

&lt;p&gt;ARC401&lt;br&gt;
This chalk talk describes application partitioning strategies, failure mitigations, and trade-offs for designing extremely resilient recovery-oriented architectures. Developers can achieve extreme levels of resilience by building redundant application replicas and designing routing policies that can detect and remove impaired application replicas from service: these are called recovery-oriented architectures. Using recovery-oriented architectures, service owners can make changes one replica at a time and can immediately remove a replica from service when it experiences an impairment. When application replicas are aligned to AWS fault domains, including Availability Zones or AWS Regions, infrastructure impairments can be contained to an individual application replica and applications recover…&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Monday, November 28&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 5:30 PM - 6:30 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Chalk Talk&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Wynn&lt;/p&gt;

&lt;h3&gt;
  
  
  Proactive auto scaling for optimal cost and availability
&lt;/h3&gt;

&lt;p&gt;CMP412&lt;br&gt;
Amazon EC2 Auto Scaling groups allow you to extract the elasticity benefits of the AWS Cloud. In this workshop, learn how to make the most of the latest innovations from Amazon EC2 Auto Scaling to further improve your web application availability at lower costs. Specifically, dive into using a combination of predictive scaling, dynamic scaling, and warm pool features to automatically launch and terminate capacity with changing demands throughout a day. With more responsive and proactive scaling, you run only the required number of instances at any time of the day, reducing the cost of overprovisioned Amazon EC2 instances. You must bring your laptop to participate.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Wednesday, November 30&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 5:30 PM - 7:30 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt;  Workshop&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Venetian&lt;/p&gt;

&lt;h3&gt;
  
  
  Concepts to take your Kubernetes operations and scale to the next level
&lt;/h3&gt;

&lt;p&gt;CON402&lt;br&gt;
In this workshop, learn about Kubernetes best practices and dive into the critical concepts, tools, and configurations necessary to ensure application health. Gain hands-on experience securing, scaling, and observing Kubernetes to identify, debug, and improve overall performance. You must bring your laptop to participate.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Thursday, December 1&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 2:00 PM - 4:00 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Workshop&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; MGM Grand&lt;/p&gt;

&lt;h3&gt;
  
  
  Advanced VPC design and new Amazon VPC capabilities
&lt;/h3&gt;

&lt;p&gt;NET302&lt;/p&gt;

&lt;p&gt;&lt;em&gt;This is a must-see if you are studying for the SA Pro exam.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Amazon VPC gives you more control over your AWS virtual networking environment. Given this ability, have you ever wondered how new Amazon VPC features might affect how you design your AWS networking infrastructure or change your current architecture? In this session, learn about the new design and capabilities of Amazon VPC and how you might use them.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Friday, December 2&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 8:30 AM - 9:30 AM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 300 - Advanced&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Breakout Session&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Level 1, Forum 121, Caesars Forum&lt;/p&gt;

&lt;h3&gt;
  
  
  A day in the life of a billion requests
&lt;/h3&gt;

&lt;p&gt;SEC404&lt;br&gt;
Every day, sites around the world authenticate their callers. That is, they verify cryptographically that the requests are actually coming from who they claim to come from. In this session, learn about unique AWS requirements for scale and security that have led to some interesting and innovative solutions to this need. How did solutions evolve as AWS scaled multiple orders of magnitude and spread into many AWS Regions around the globe? Hear about some of the recent enhancements that have been launched to support new AWS features, and walk through some of the mechanisms that help ensure that AWS systems operate with minimal privileges.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Wednesday, November 30&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 10:45 AM - 11:45 AM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Breakout Session&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Venetian&lt;/p&gt;

&lt;h3&gt;
  
  
  AWS Nitro: Enhancing security and future-proofing your instances
&lt;/h3&gt;

&lt;p&gt;CMP404&lt;br&gt;
AWS has completely reimagined the virtualization infrastructure with the AWS Nitro System, which provides a rich collection of building blocks that offload many traditional virtualization functions to dedicated hardware and software to deliver high performance, high availability, and high security while reducing virtualization overhead. In this chalk talk, dive deep into the Nitro System and discover how AWS extended the Nitro System’s modern hardware and software so organizations can run their workloads well beyond the typical lifetime of the underlying hardware.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Wednesday, November 30&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 10:45 AM - 11:45 AM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Chalk Talk&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Grand&lt;/p&gt;

&lt;h3&gt;
  
  
  Enhancing OS and application security on AWS Graviton–based instances
&lt;/h3&gt;

&lt;p&gt;CMP409&lt;br&gt;
AWS Graviton processors feature key capabilities that enable you to run cloud-native applications securely and at scale. AWS Graviton3 processors feature always-on memory encryption, dedicated caches for every vCPU, and support for pointer authentication. In this chalk talk, dive deep into how developers can build secure applications using pointer authentication.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Wednesday, November 30&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 2:30 PM - 3:30 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Chalk Talk&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Caesars Forum&lt;/p&gt;

&lt;h3&gt;
  
  
  Enabling multi-party analysis of sensitive data using AWS Nitro Enclaves
&lt;/h3&gt;

&lt;p&gt;CMP403&lt;br&gt;
In this chalk talk, learn how to utilize AWS Nitro Enclaves, take advantage of its unique capabilities, and integrate with services such as AWS KMS. Use these options to enable collaboration and multi-party computation of sensitive datasets in a secure manner, without revealing the dataset of any party to another.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Wednesday, November 30&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 3:15 PM - 4:15 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Chalk Talk&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Caesars Forum&lt;/p&gt;

&lt;h3&gt;
  
  
  Building modern data architectures on AWS
&lt;/h3&gt;

&lt;p&gt;ARC313&lt;br&gt;
The modern data architecture is an evolution from data warehouse and data lake–based solutions, and it allows you to query data across your data warehouse, data lake, and purpose-built analytics services to gain faster and deeper insights that would not be possible otherwise. In this session, learn how to design, create, and operate a modern data architecture on AWS by using AWS purpose-built data services. Also learn about modern data architecture concepts, why customers are adapting to it, modern data architecture pillars, reference architectures, and best practices while building modern data architectures on AWS for optimal performance and cost effectiveness.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Friday, December 2&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 9:15 AM - 10:15 AM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 300 - Advanced&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Breakout Session&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Caesars Forum&lt;/p&gt;

&lt;h3&gt;
  
  
  Canary deployment with metrics-driven rollback ability for Amazon ECS
&lt;/h3&gt;

&lt;p&gt;CON409&lt;br&gt;
In this builders’ session, learn about the importance of deployment strategies and how to implement a CI/CD strategy using AWS CodeSuite services for performing canary deployments on Amazon ECS, with AWS App Mesh for communication between the microservices. The session also covers what can be achieved with AWS CodeDeploy and the differences in our reference architecture, which does not require ELBs for internal communication. You must bring your laptop to participate.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Thursday, December 1&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 3:30 PM - 4:30 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Builders' Session&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Caesars Forum&lt;/p&gt;

&lt;h1&gt;
  
  
  Serverless
&lt;/h1&gt;

&lt;h3&gt;
  
  
  Best practices for advanced serverless developers
&lt;/h3&gt;

&lt;p&gt;SVS401&lt;br&gt;
Are you an experienced serverless developer? Do you want a helpful guide for unleashing the full power of serverless architectures for your production workloads? Are you wondering whether to choose a stream or an API as your event source, or whether to have one function or many? This session provides architectural best practices, optimizations, and useful cheat codes that you can use to build secure, high-scale, and high-performance serverless applications and uses real customer scenarios to illustrate the benefits.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Tuesday, November 29&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 12:30 PM - 1:30 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Breakout Session&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Mandalay Bay&lt;/p&gt;

&lt;h3&gt;
  
  
  Automated testing for serverless applications
&lt;/h3&gt;

&lt;p&gt;COM401&lt;br&gt;
During this dev chat, learn briefly about the theory behind testing serverless applications in the cloud, using real AWS services instead of locally emulated resources. The chat covers common and advanced topics, such as how and when to write integration and end-to-end tests; how to test asynchronous workflows; and how to configure your serverless project so that tests can be fully automated. Bring your questions, which will be answered with suitable examples, including code samples and demos.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Wednesday, November 30&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 12:30 PM - 1:00 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Dev Chat&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Venetian&lt;/p&gt;

&lt;h1&gt;
  
  
  Analytics
&lt;/h1&gt;

&lt;h3&gt;
  
  
  How Poshmark accelerates growth via real-time analytics &amp;amp; personalization
&lt;/h3&gt;

&lt;p&gt;ANT342&lt;br&gt;
In this session, learn how Poshmark is focusing on achieving top-line growth through personalization and enhanced user experience. The initial approach of using batch processing for personalization and security did not meet expectations for customer experience improvement. To enhance customer experience, Poshmark designed real-time personalization using real-time event capture with Amazon MSK and real-time data enrichment with Amazon Kinesis Data Analytics. Amazon SageMaker and Kinesis Data Firehose supported real-time personalization by saving copies of the personalizations to Amazon S3. This solution improved customer experience, reduced security risks, and allowed end users to more confidently interact with the Poshmark app.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Friday, December 2&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 8:30 AM - 9:30 AM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 300 - Advanced&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Breakout Session&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Caesars Forum&lt;/p&gt;

&lt;h3&gt;
  
  
  Build a web-scale application with purpose-built databases &amp;amp; analytics
&lt;/h3&gt;

&lt;p&gt;DAT401&lt;br&gt;
In this workshop, learn to build modern web applications at scale using purpose-built databases. Discover how to apply development patterns using Amazon DynamoDB, Amazon ElastiCache, Amazon Neptune, and Amazon OpenSearch Service to build a fully functional and scalable bookstore ecommerce application and dive deep into best practices along the way. Basic familiarity with AWS concepts and services such as IAM, VPC, networking, and storage services is recommended. You must bring your laptop to participate.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Monday, November 28&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 10:00 AM - 12:00 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Workshop&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Caesars Forum&lt;/p&gt;

&lt;h3&gt;
  
  
  Event detection with Amazon MSK and Amazon Kinesis Data Analytics
&lt;/h3&gt;

&lt;p&gt;ANT403&lt;br&gt;
In this workshop, take on the role of a technology manager for a Las Vegas casino. Your assignment is to create a stream processing application that identifies customers entering your casino who have previously gambled with large sums of money and sends you a text message when big spenders sit down at a gaming table. To do this, use Amazon MSK to capture events, Amazon Kinesis Data Analytics Studio to detect events of interest, and AWS Lambda with Amazon SNS to send you a message for any events. You must bring your laptop to participate.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Monday, November 28&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 10:00 AM - 12:00 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Workshop&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Wynn&lt;/p&gt;

&lt;h3&gt;
  
  
  Deliver a connected travel experience with distributed data mesh
&lt;/h3&gt;

&lt;p&gt;TRV402&lt;br&gt;
As more travel and hospitality businesses migrate to the cloud, there’s an opportunity to integrate global datasets to offer travelers and guests a seamless, connected experience across the businesses they interact with on a single journey. Learn how your organization can interact with trusted third-party entities while protecting customer data and adhering to privacy regulations by synchronously and asynchronously serving your data as a product in a distributed, decentralized, and multi-cloud data mesh.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Monday, November 28&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 1:45 PM - 2:45 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Chalk Talk&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Caesars Forum&lt;/p&gt;

&lt;h3&gt;
  
  
  Human vs. machine: Amazon Redshift ML inferences
&lt;/h3&gt;

&lt;p&gt;ANT402&lt;br&gt;
In this builders’ session, use Amazon Redshift machine learning (ML) capabilities to draw inferences for a speed race between humans and machines. A training dataset from previous races is provided for you to build and train an ML model. Use simple SQL to build the ML model and then draw inferences using the inference dataset provided. You must bring your laptop to participate.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Tuesday, November 29&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 11:00 AM - 12:00 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 400 - Expert&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Builders' Session&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Caesars Forum&lt;/p&gt;

&lt;h1&gt;
  
  
  Other
&lt;/h1&gt;

&lt;h3&gt;
  
  
  How Amazon uses AWS IoT to improve sustainability across its buildings
&lt;/h3&gt;

&lt;p&gt;IOT204&lt;br&gt;
Sustainability has quickly become a top operational priority for companies managing large real estate portfolios. A key challenge companies face is deciding how to make their efforts more data-driven, starting at the edge. This means rethinking the IoT layer for a new generation of cloud-based insights that can scale to meet the needs of companies that manage millions of square feet of physical space. In this session, see how Amazon uses AWS IoT services in its physical locations to scale sustainability efforts and remove blockers that impede data-driven decision-making.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Friday, December 2&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 11:30 AM - 12:30 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 200 - Intermediate&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Breakout Session&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Caesars Forum&lt;/p&gt;

&lt;h3&gt;
  
  
  Take these open-source tools on your AWS adventure
&lt;/h3&gt;

&lt;p&gt;BOA202&lt;br&gt;
You’ve set out on a grand adventure to learn, build and expand on AWS. Like any good adventure, it has its challenges. Time to gear up! Grab your best tools and gear to help you on your way. In this session, have a look at open-source tools that can help make your AWS adventure easier. See something for security and permissions, something for cost management, and a few more things for building in the cloud—tools like Infracost, IAMLive, and more.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Date:&lt;/strong&gt;  Friday, December 2&lt;br&gt;
&lt;strong&gt;Time:&lt;/strong&gt; 11:30 AM - 12:30 PM&lt;br&gt;
&lt;strong&gt;Session level:&lt;/strong&gt; 200 - Intermediate&lt;br&gt;
&lt;strong&gt;Session Type:&lt;/strong&gt; Breakout Session&lt;br&gt;
&lt;strong&gt;Location:&lt;/strong&gt; Venetian&lt;/p&gt;

&lt;h1&gt;
  
  
  See you there
&lt;/h1&gt;

&lt;p&gt;For those going to re:invent, I hope to see you there. DM me on Twitter (@philipbasford) and let's meet up.&lt;/p&gt;

</description>
      <category>aws</category>
    </item>
    <item>
      <title>Fish Cam : Live water temperature monitoring</title>
      <dc:creator>philbasford</dc:creator>
      <pubDate>Sat, 02 Apr 2022 10:08:11 +0000</pubDate>
      <link>https://forem.com/aws-builders/fish-cam-live-water-temperature-monitoring-49fm</link>
      <guid>https://forem.com/aws-builders/fish-cam-live-water-temperature-monitoring-49fm</guid>
<description>&lt;p&gt;In this blog I will look at how to implement live monitoring of the water temperature in the fish tank where my daughter's two goldfish, Goldie and Star, spend their days swimming. The tank is located next to the desk that I am writing this blog from.&lt;/p&gt;

&lt;h2&gt;
  
  
  Further background:
&lt;/h2&gt;

&lt;p&gt;My "Office" (aka my daughters bedroom) is located at the front of our house. Our house is north facing, therefore in the morning the sun is on the back of the house and in the afternoon it is on the front where my "Office" is. My "Office  has a window that I look out from when I am working from home. In late afternoon the sun especially shines straight in this window (you may see this on some of the YouTube videos that I have done). &lt;/p&gt;

&lt;p&gt;The main thing here is the safety of the fish. They &lt;strong&gt;must not&lt;/strong&gt; be put in direct sunshine; this would heat the water too much and also increase the algae due to the sunlight. So as an extra safeguard I typically close the curtains when the sun is very bright or strong. However, the ambient temperature of the room still increases, which means the water temperature of the tank also increases. Therefore it would be good to know the water temperature, as goldfish require a range of between 20°C and 23°C.&lt;/p&gt;

&lt;h2&gt;
  
  
  Hardware:
&lt;/h2&gt;

&lt;p&gt;To measure water temperature you need to invest in some additional components to extend your Raspberry Pi. Both a submersible temperature sensor like the DS18B20 (&lt;a href="https://www.amazon.co.uk/dp/B08HHWD1K9"&gt;https://www.amazon.co.uk/dp/B08HHWD1K9&lt;/a&gt;) and a breadboard are essential for this. The DS18B20 is ideal as it has an integrated digital interface that you can read straight from, so you do not need to fiddle with an analog converter. &lt;/p&gt;

&lt;p&gt;To set up the hardware you need to do the following: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Shutdown your Pi and unplug it&lt;/li&gt;
&lt;li&gt;Connect your breadboard to the GPIO on the Pi&lt;/li&gt;
&lt;li&gt;Connect the sensor to the breadboard. &lt;/li&gt;
&lt;li&gt;Start up your Pi&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Here is a wiring diagram that helps, thanks to Circuit Basics:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--pUoguztz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/d1fcr6255i3q9lle0qfr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--pUoguztz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/d1fcr6255i3q9lle0qfr.png" alt="Circuit" width="880" height="387"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Some tips:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;A pull-up resistor of at least 4.7 kΩ is required on the data line. This holds the line high so that the digital interface contained within the sensor's housing can signal reliably, and the Pi can then read the temperature from that interface. I had to use a resistor from my breadboard kit, and this website will help you work out the resistance of your resistor: &lt;a href="https://www.calculator.net/resistor-calculator.html"&gt;https://www.calculator.net/resistor-calculator.html&lt;/a&gt;. If you need to use a couple of resistors, make sure you wire them in series. &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Notice the sensor above comes with a connector at the end. For my first attempt at wiring this up, I cut off the connector, stripped the wires and tried to push them into my breadboard. This was a mistake and did not work. Luckily my breadboard came with a single-row pin header. This allowed me to put the pin header into the board and the connector onto the pin header. Much easier, but make sure you place the single-row pin header horizontally, not vertically.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
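&lt;p&gt;A quick sanity check on the series-resistor tip above: resistances wired in series simply add. Here is a minimal Python sketch (the 2.2 kΩ and 3.3 kΩ values are just examples, not the resistors from my kit):&lt;/p&gt;

```python
# Resistances wired in series simply add, so two smaller resistors can
# stand in for the recommended single 4.7 kOhm pull-up.
def series_resistance(*resistors_ohms):
    return sum(resistors_ohms)

# Example with illustrative values:
total = series_resistance(2200, 3300)
print(total)          # 5500 ohms
print(total >= 4700)  # True: enough for the DS18B20 pull-up
```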

&lt;p&gt;My final breadboard wiring looked like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--04KccaVj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gtaxfjwlzftzy0ymwd4r.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--04KccaVj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gtaxfjwlzftzy0ymwd4r.jpg" alt="Board" width="880" height="1087"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--WT4yRZYK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/92x6b1sss91eol4wlg6j.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--WT4yRZYK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/92x6b1sss91eol4wlg6j.jpg" alt="GPIO" width="880" height="1024"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Turning on and setting up the sensor:
&lt;/h2&gt;

&lt;p&gt;You will need to turn on the digital (one-wire) interface on the Pi (GPIO 4). This can be done using raspi-config.&lt;/p&gt;

&lt;p&gt;Select the Interfaces Menu:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zTs630vB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/c95i7nhvhsh70stjtnif.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zTs630vB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/c95i7nhvhsh70stjtnif.png" alt="Select Interfaces" width="880" height="494"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select one-wire:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--kaoqdIYe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vuniswpis9dzwpcpy54g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--kaoqdIYe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vuniswpis9dzwpcpy54g.png" alt="Select one wire" width="880" height="498"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Turn on one-wire:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--awu4Y-Z8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/og2nhh2t9ia7ve37rmr0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--awu4Y-Z8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/og2nhh2t9ia7ve37rmr0.png" alt="turn on" width="880" height="611"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then restart your Pi&lt;/p&gt;

&lt;h3&gt;
  
  
  Testing
&lt;/h3&gt;

&lt;p&gt;Using cat, we can check that we can read the temperature value straight from the digital (one-wire) interface. One-wire mounts devices at /sys/bus/w1/devices/. This folder should contain another folder named with the UUID of the sensor; if you have more than one sensor, each one has its own folder. Within the sensor's folder there is a file called 'temperature', and its contents are the sensor's reading in millidegrees Celsius.&lt;/p&gt;

&lt;p&gt;Here is an example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ssh pi                       
Linux raspberrypi 5.10.17-v7+ #1414 SMP Fri Apr 30 13:18:35 BST 2021 armv7l

The programs included with the Debian GNU/Linux system are free software;
the exact distribution terms for each program are described in the
individual files in /usr/share/doc/*/copyright.

Debian GNU/Linux comes with ABSOLUTELY NO WARRANTY, to the extent
permitted by applicable law.
Last login: Sat Apr  2 10:53:44 2022 from 192.168.x.x
pi@raspberrypi:~ $ ls /sys/bus/w1/devices/
28-00000001a7df  w1_bus_master1
pi@raspberrypi:~ $ cat /sys/bus/w1/devices/28-00000001a7df/temperature 
20625
pi@raspberrypi:~ $ 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;So that is 20.625°C.&lt;/p&gt;
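&lt;p&gt;The same read-and-convert step can be done in Python. Here is a minimal sketch; the read_temp_c helper and its arguments are illustrative, not part of the final Lambda code:&lt;/p&gt;

```python
# Minimal sketch: read a DS18B20 value from the one-wire sysfs tree and
# convert it from millidegrees Celsius to degrees Celsius.
def read_temp_c(sensor_id, base='/sys/bus/w1/devices/'):
    with open(base + sensor_id + '/temperature') as f:
        return int(f.read().strip()) / 1000.0

# The raw reading in the session above was 20625, i.e. 20.625 degrees C
print(20625 / 1000.0)  # 20.625
```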

&lt;h1&gt;
  
  
  Lambda:
&lt;/h1&gt;

&lt;p&gt;In order to get the temperature from the one-wire interface into AWS, we need to write some custom code. To do this I used a Lambda written in Python. The Lambda reads the temperature from the interface like any other normal file. The value is then parsed and turned into a JSON object. Then, using the publish option in the Greengrass SDK, I send the message via an MQTT topic into IoT Core. Here are some very important concerns/notes when doing this:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;The Lambda needs to be a long-running Lambda. This basically means it is not a Lambda at all: it is in fact a Python app that is deployed via Greengrass and started when Greengrass starts. It needs a run loop that polls the temperature and publishes, while listening for SIGTERM and SIGINT.&lt;/li&gt;
&lt;li&gt;It needs to run as a container with read access to /sys on the host.&lt;/li&gt;
&lt;li&gt;The Greengrass v2 and v1 ways of sending MQTT messages, and how a deployment is done, are completely different. My code was originally v1, so I had to deploy the LegacySubscriptionRouter with the correct configuration to make it compatible. &lt;/li&gt;
&lt;li&gt;I had to do an OTA upgrade of the nucleus so that my other components were deployable. &lt;/li&gt;
&lt;/ol&gt;
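&lt;p&gt;To illustrate point 1, here is a hypothetical sketch of that long-running shape: a plain run loop that polls and publishes until it is told to stop via SIGTERM or SIGINT. The Runner class, the polling interval, and the max_polls escape hatch are all illustrative, not from my actual deployment:&lt;/p&gt;

```python
import signal
import time

# Illustrative sketch of a "long-running lambda": a plain Python loop
# that polls until Greengrass stops the process.
class Runner:
    def __init__(self):
        self.running = True
        # Stop cleanly when the process receives SIGTERM or SIGINT
        signal.signal(signal.SIGTERM, self.stop)
        signal.signal(signal.SIGINT, self.stop)

    def stop(self, signum=None, frame=None):
        self.running = False

    def run(self, poll, interval_seconds=30.0, max_polls=None):
        polls = 0
        while self.running:
            poll()  # e.g. read the sensor and publish over MQTT
            polls += 1
            if max_polls is not None and polls >= max_polls:
                break
            time.sleep(interval_seconds)
        return polls
```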

&lt;p&gt;Here is the main function executed within the run loop:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def process(iot_client: Client):

    # conh
    sensor_id = os.getenv('TEMP_ID', '28-00000001e2d1')
    path = os.getenv('TEMP_PATH', '/sys/bus/w1/devices/')

    temperature = gettemp(sensor_id, path) / float(1000)
    msg = {
        'sensor_id': sensor_id,
        'temperature': temperature
    }

    send_message(iot_client, msg)

    return msg
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is how to read the temperature:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def gettemp(sensor_id, path):
    try:
        mytemp = ''
        filename = 'w1_slave'
        f = open(path + sensor_id + '/' + filename, 'r')
        line = f.readline()  # read 1st line
        crc = line.rsplit(' ', 1)
        crc = crc[1].replace('\n', '')
        if crc == 'YES':
            line = f.readline()  # read 2nd line
            mytemp = line.rsplit('t=', 1)
        else:
            mytemp = 99999
        f.close()

        return int(mytemp[1])

    except Exception as e:
        LOGGING.exception(e)
        return 99999
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is how to send the message via MQTT (note this is V1 style):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def send_message(iot_client: greengrasssdk.IoTDataPlane.Client, message: dict, topic=TOPIC):

    iot_client.publish(
        topic=topic,
        payload=json.dumps(message).encode("utf-8")
    )

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now if you want to look at the code in more detail, it is located at &lt;a href="https://github.com/philbasford/fishcam"&gt;https://github.com/philbasford/fishcam&lt;/a&gt;. Also note that I deployed this using AWS SAM.&lt;/p&gt;

&lt;h1&gt;
  
  
  IOT Component:
&lt;/h1&gt;

&lt;p&gt;First select the Fish Cam Lambda and use the Lambda name and version as the initial component details. However, you may need to bump the version later if you need to reconfigure.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--3stQgMLK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/250f8bbngioeqwuhe14y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--3stQgMLK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/250f8bbngioeqwuhe14y.png" alt="Select Lambda" width="880" height="645"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As mentioned before, one of the most important things is that the Lambda is pinned/long-running, so that it starts and stops with Greengrass. Also remove any event source:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--epMqSL3c--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/51iijzz0s70jlg6zp70t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--epMqSL3c--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/51iijzz0s70jlg6zp70t.png" alt="Long Running" width="880" height="585"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Make sure you use a container but with access to /sys:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--B7eFlnDe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jax4zsqcngigfjgiwnm0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--B7eFlnDe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jax4zsqcngigfjgiwnm0.png" alt="Container" width="880" height="648"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Create IOT Deployment:
&lt;/h1&gt;

&lt;p&gt;Create a new deployment:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--arAoMydn--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kumdzjaovzl15k02re0u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--arAoMydn--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kumdzjaovzl15k02re0u.png" alt="New Deployment" width="880" height="496"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Add the Lambda component you created, then the legacy subscription router and nucleus public components.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8oeLPcWC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hus8vtx2fgzvux17al8a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8oeLPcWC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hus8vtx2fgzvux17al8a.png" alt="Add Components" width="880" height="647"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now go into the legacy subscription router configuration:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--QydQWQBj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/18s9n3xqztk847odjvxs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--QydQWQBj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/18s9n3xqztk847odjvxs.png" alt="Configure Legacy" width="880" height="525"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Add the following (change the function name to match yours):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "subscriptions": {
        "Greengrass_FishCam_to_cloud": {
            "id": "Greengrass_FishCam_to_cloud",
            "source": "component:cfFishCam-FishCamFunction-R7vEIfJai8vl",
            "subject": "fishcam/temperature",
            "target": "cloud"
        }
    }
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then submit it:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--oQNXC5g---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wzq6jd43dmrbcpizcr9m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--oQNXC5g---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wzq6jd43dmrbcpizcr9m.png" alt="Legacy Settings" width="880" height="383"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Leave the Advanced Settings as is:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Y1sqBDnY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mddwwg30t5xkjxxxve52.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Y1sqBDnY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mddwwg30t5xkjxxxve52.png" alt="Advance Settings" width="880" height="557"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then deploy:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--jsuZdeEC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wh6rrnnc8shj6gu7wzul.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--jsuZdeEC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wh6rrnnc8shj6gu7wzul.png" alt="Deploy" width="880" height="586"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, if it deployed OK, you should see the messages coming in within the MQTT test client in the AWS IoT section of the AWS Console:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--7Ndi5MvZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/lwd3waimsmftj6b5md62.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--7Ndi5MvZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/lwd3waimsmftj6b5md62.png" alt="MQTT" width="880" height="373"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  IoT Rule &amp;amp; CloudWatch
&lt;/h1&gt;

&lt;p&gt;To get the messages into CloudWatch, you need to create a new rule in the Act area of the AWS IoT section of the AWS Console:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--9HAFfVmK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/s6kcjp32je8mq4qdkm2e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--9HAFfVmK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/s6kcjp32je8mq4qdkm2e.png" alt="New Rule" width="880" height="397"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Use the following SQL to make sure you process all the messages sent to the topic:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--NSJenXYU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/16gmurt30hx0zo9smt0c.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--NSJenXYU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/16gmurt30hx0zo9smt0c.png" alt="Select SQL" width="880" height="406"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then add an action and select CloudWatch Metric:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ut08BNS0--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gnsplyofxq1wysuuflp8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ut08BNS0--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gnsplyofxq1wysuuflp8.png" alt="CW Rule" width="880" height="543"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then use the following settings, with substitutions for the values contained in the messages: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--J83izba_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/lfcaws3mu5fyklasq3mt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--J83izba_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/lfcaws3mu5fyklasq3mt.png" alt="CW Settings" width="880" height="600"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You need an IAM Role, and here are the permissions needed:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "Version": "2012-10-17",
    "Statement": {
        "Effect": "Allow",
        "Action": "cloudwatch:PutMetricData",
        "Resource": [
            "*"
        ]
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
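&lt;p&gt;To make the mapping concrete, here is a rough Python sketch of what the CloudWatch Metric action effectively does with each message. The namespace, metric name and unit below are my assumptions (use whatever you entered in the action settings); the returned dict mirrors the shape boto3's CloudWatch put_metric_data call expects:&lt;/p&gt;

```python
def metric_from_message(message, namespace="FishCam"):
    # Turn an incoming MQTT message into CloudWatch metric parameters.
    # Namespace, MetricName and Unit are illustrative assumptions.
    return {
        "Namespace": namespace,
        "MetricData": [{
            "MetricName": "WaterTemperature",
            "Value": float(message["temperature"]),
            "Unit": "None",
        }],
    }
```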



&lt;h1&gt;
  
  
  Conclusion
&lt;/h1&gt;

&lt;p&gt;Measuring the temperature of the water in the fish tank is really easy; I was able to do this in a few hours once I had purchased the additional sensor. It is a good example of how to chain a lot of AWS services together to make something useful at low cost.&lt;/p&gt;

&lt;p&gt;Here is the final CloudWatch chart of the temperature over a day. Notice that as the sun comes up and goes down, the temperature changes:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--deERkriM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8samb19z9q2xmt5wbe50.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--deERkriM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8samb19z9q2xmt5wbe50.png" alt="Chart" width="880" height="317"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Thank you and I hope you found this interesting. Next time I will be looking at how to use machine learning to spot fish!&lt;/p&gt;

</description>
      <category>aws</category>
      <category>machinelearning</category>
      <category>iot</category>
    </item>
    <item>
      <title>re:invent 2021 Sessions</title>
      <dc:creator>philbasford</dc:creator>
      <pubDate>Thu, 04 Nov 2021 22:51:00 +0000</pubDate>
      <link>https://forem.com/aws-builders/reinvent-2021-sessions-3pc6</link>
      <guid>https://forem.com/aws-builders/reinvent-2021-sessions-3pc6</guid>
<description>&lt;p&gt;It is back and it is in person: it is re:invent! Now comes the biggest challenge: working out where to go and which sessions to catch. Therefore I have put this re:invent session guide/recommendations list together so that if you are into ML Ops, ML Architecture, Edge Computing or Data Analytics, it can act as a starting point. However, please leave some time in your schedule, because as AWS announce new stuff they will add more sessions.&lt;/p&gt;

&lt;p&gt;Hang on, what if you are not going? Then you are like me: I have to stay home for some personal reasons. However, you can still catch the keynotes and leadership sessions on the live stream. Do not forget the big three keynotes: Adam Selipsky (taking over from Andy), Werner Vogels (always dev centric) and Swami Sivasubramanian (all ML). Later, you can also use this guide to watch the Breakout Sessions once AWS have made them available on demand.&lt;/p&gt;

&lt;h1&gt;
  
  
  Sessions
&lt;/h1&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Session ID&lt;/th&gt;
&lt;th&gt;Title&lt;/th&gt;
&lt;th&gt;Type&lt;/th&gt;
&lt;th&gt;LINK&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;AIM320&lt;/td&gt;
&lt;td&gt;Implementing MLOps practices with Amazon SageMaker&lt;/td&gt;
&lt;td&gt;Breakout Session&lt;/td&gt;
&lt;td&gt;&lt;a href="https://www.google.com/calendar/render?action=TEMPLATE&amp;amp;text=AIM320%09Implementing%20MLOps%20practices%20with%20Amazon%20SageMaker&amp;amp;location=Palazzo%20C%2C%20The%20Venetian&amp;amp;details=Implementing%20MLOps%20practices%20helps%20data%20scientists%20and%20operations%20engineers%20collaborate%20to%20prepare%2C%20build%2C%20train%2C%20deploy%2C%20and%20manage%20models%20at%20scale.%20During%20this%20session%2C%20explore%20the%20breadth%20of%20MLOps%20features%20in%20Amazon%20SageMaker%20that%20help%20you%20provision%20consistent%20model%20development%20environments%2C%20automate%20ML%20workflows%2C%20implement%20CI%2FCD%20pipelines%20for%20ML%2C%20monitor%20models%20in%20production%2C%20and%20standardize%20model%20governance%20capabilities.%20Then%2C%20hear%20from%20Vanguard%20as%20they%20share%20their%20journey%20enabling%20MLOps%20to%20achieve%20ML%20at%20scale%20for%20their%20polyglot%20model%20development%20platforms%20using%20Amazon%20SageMaker%20features%2C%20including%20SageMaker%20projects%2C%20SageMaker%20Pipelines%2C%20SageMaker%20Model%20Registry%2C%20and%20SageMaker%20Model%20Monitor.&amp;amp;dates=20211202T223000Z%2F20211202T233000Z"&gt;Invite&lt;/a&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;AIM407&lt;/td&gt;
&lt;td&gt;Train deep learning models at scale with Amazon SageMaker&lt;/td&gt;
&lt;td&gt;Breakout Session&lt;/td&gt;
&lt;td&gt;&lt;a href="https://www.google.com/calendar/render?action=TEMPLATE&amp;amp;text=AIM407%09Train%20deep%20learning%20models%20at%20scale%20with%20Amazon%20SageMaker&amp;amp;location=Palazzo%20A%2C%20The%20Venetian&amp;amp;details=Today%2C%20AWS%20customers%20use%20Amazon%20SageMaker%20to%20train%20and%20tune%20millions%20of%20machine%20learning%20(ML)%20models%20with%20billions%20of%20parameters.%20In%20this%20session%2C%20learn%20about%20advanced%20SageMaker%20capabilities%20that%20can%20help%20you%20manage%20large-scale%20model%20training%20and%20tuning%2C%20such%20as%20distributed%20training%2C%20automatic%20model%20tuning%2C%20optimizations%20for%20deep%20learning%20algorithms%2C%20debugging%2C%20profiling%2C%20and%20model%20checkpointing%2C%20so%20that%20even%20the%20largest%20ML%20models%20can%20be%20trained%20in%20record%20time%20for%20the%20lowest%20cost.%20Then%2C%20hear%20from%20Aurora%2C%20a%20self-driving%20vehicle%20technology%20company%2C%20on%20how%20they%20use%20SageMaker%20training%20capabilities%20to%20train%20large%20perception%20models%20for%20autonomous%20driving%20using%20massive%20amounts%20of%20images%2C%20video%2C%20and%203D%20point%20cloud%20data.&amp;amp;dates=20211201T023000Z%2F20211201T033000Z"&gt;Invite&lt;/a&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;AIM405&lt;/td&gt;
&lt;td&gt;Right-sizing Amazon SageMaker compute instances for ML model training and inference&lt;/td&gt;
&lt;td&gt;Chalk Talk&lt;/td&gt;
&lt;td&gt;&lt;a href="https://www.google.com/calendar/render?action=TEMPLATE&amp;amp;text=AIM405%09Right-sizing%20Amazon%20SageMaker%20compute%20instances%20for%20ML%20model%20training%20and%20inference&amp;amp;location=Summit%20217%2C%20Caesars%20Forum&amp;amp;details=Amazon%20SageMaker%20offers%20several%20compute%20capabilities%20and%20instance%20types%20to%20meet%20the%20needs%20of%20your%20machine%20learning%20(ML)%20use%20cases%2C%20including%20computer%20vision%2C%20recommendation%20engines%2C%20and%20natural%20language%20processing.%20In%20this%20session%2C%20discover%20the%20compute%20options%20available%20to%20you%2C%20including%20GPUs%2C%20CPUs%2C%20and%20AWS%20Inferentia.%20Learn%20the%20best%20practices%20and%20essential%20criteria%2C%20including%20price%2C%20performance%2C%20flexibility%20to%20support%20ML%20frameworks%2C%20and%20ease%20of%20use%2C%20so%20you%20can%20choose%20the%20right%20capabilities%20for%20your%20ML%20workload.&amp;amp;dates=20211130T013000Z%2F20211130T023000Z"&gt;Invite&lt;/a&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;AIM413&lt;/td&gt;
&lt;td&gt;Detect machine learning model drift in production&lt;/td&gt;
&lt;td&gt;Chalk Talk&lt;/td&gt;
&lt;td&gt;&lt;a href="https://www.google.com/calendar/render?action=TEMPLATE&amp;amp;text=AIM413%09Detect%20machine%20learning%20model%20drift%20in%20production&amp;amp;location=Level%201%2C%20Forum%20104%2C%20Caesars%20Forum&amp;amp;details=One%20of%20the%20major%20factors%20that%20can%20affect%20the%20accuracy%20of%20models%20is%20the%20difference%20between%20data%20used%20to%20generate%20predictions%20and%20data%20used%20for%20training.%20For%20example%2C%20changing%20economic%20conditions%20could%20drive%20new%20interest%20rates%20affecting%20home%20purchasing%20predictions.%20Amazon%20SageMaker%20Model%20Monitor%20automatically%20detects%20drift%20in%20deployed%20models%20and%20provides%20detailed%20alerts%20that%20help%20identify%20the%20source%20of%20the%20problem%20and%20trigger%20model%20retraining%20and%20a%20validation%20pipeline.%20In%20this%20chalk%20talk%2C%20also%20learn%20how%20you%20can%20use%20human-in-the-loop%20to%20validate%20model%20predictions.&amp;amp;dates=20211130T030000Z%2F20211130T040000Z"&gt;Invite&lt;/a&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;DOP211&lt;/td&gt;
&lt;td&gt;Building scalable machine learning pipelines&lt;/td&gt;
&lt;td&gt;Chalk Talk&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;ARC323&lt;/td&gt;
&lt;td&gt;Designing Well-Architected machine learning workloads&lt;/td&gt;
&lt;td&gt;Chalk Talk&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;KYN003&lt;/td&gt;
&lt;td&gt;Swami Sivasubramanian Keynote&lt;/td&gt;
&lt;td&gt;Keynote&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;AIM401&lt;/td&gt;
&lt;td&gt;Create, automate, and manage end-to-end ML workflows at scale&lt;/td&gt;
&lt;td&gt;Workshop&lt;/td&gt;
&lt;td&gt;&lt;a href="https://www.google.com/calendar/render?action=TEMPLATE&amp;amp;text=AIM401%09Create%2C%20automate%2C%20and%20manage%20end-to-end%20ML%20workflows%20at%20scale&amp;amp;location=Cristal%201%2C%20Wynn&amp;amp;details=In%20this%20workshop%2C%20learn%20how%20to%20orchestrate%20machine%20learning%20(ML)%20workflows%20across%20each%20step%20of%20the%20ML%20process%2C%20including%20exploring%20and%20preparing%20data%2C%20experimenting%20with%20different%20algorithms%20and%20parameters%2C%20training%20and%20tuning%20models%2C%20and%20deploying%20models%20to%20production%20using%20Amazon%20SageMaker%20Pipelines%20and%20SageMaker%20projects.%20SageMaker%20Pipelines%20makes%20it%20easy%20to%20manage%20dozens%20of%20ML%20models%20and%20massive%20volumes%20of%20data%20and%20to%20track%20thousands%20of%20training%20experiments.%20Learn%20how%20to%20share%20and%20reuse%20workflows%2C%20helping%20you%20scale%20ML%20throughout%20your%20organization.&amp;amp;dates=20211129T184500Z%2F20211129T210000Z"&gt;Invite&lt;/a&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;IOT306&lt;/td&gt;
&lt;td&gt;Building a people counter with anomaly detection using AWS IoT and ML&lt;/td&gt;
&lt;td&gt;Workshop&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h1&gt;
  
  
  Must sees
&lt;/h1&gt;

&lt;p&gt;Out of these, here are the highlights, the must-see list I cannot wait to catch:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Implementing MLOps practices with Amazon SageMaker&lt;/strong&gt;: ML Ops is not all about pipelines, but they are one of the key technology enablers. This session looks at Amazon SageMaker Pipelines and how to rapidly deploy your models.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Detect machine learning model drift in production&lt;/strong&gt;: One of my personal passions I speak about is the 360 view of a model: the business impact, the ML performance (accuracy, etc.), system performance, inference observability, and cost. This session will look in detail at ML performance and detecting drift.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Designing Well-Architected machine learning workloads&lt;/strong&gt;: AWS has just released an update to the &lt;a href="https://docs.aws.amazon.com/wellarchitected/latest/machine-learning-lens/wellarchitected-machine-learning-lens.pdf#machine-learning-lens"&gt;Well-Architected ML Lens&lt;/a&gt;, and this session will cover all the best practices for how to lower your costs, keep your security rock tight, cope with failures, scale to millions of predictions and keep a watchful eye.&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  Fun!
&lt;/h1&gt;

&lt;p&gt;So what about the parties and the fun! If you're in Vegas, check out:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Midnight Madness&lt;/li&gt;
&lt;li&gt;re:Play&lt;/li&gt;
&lt;li&gt;Amazon’s World Famous Chicken Wing Eating Contest&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you are staying home:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Check out your local &lt;a href="https://aws.amazon.com/developer/community/usergroups/"&gt;user group&lt;/a&gt;. I know some are doing local watch parties! That's a nice way to still feel part of it.
&lt;/li&gt;
&lt;li&gt;Another fun thing I know is happening: ComSum (Community Summit UK) &lt;a href="http://www.comsum.co.uk"&gt;www.comsum.co.uk&lt;/a&gt; will be back in action. ComSum will be giving you a byte-size digest of the day's big news and expert punditry on everything happening in Vegas. This will include myself returning as one of their ML and Data Pundits. More details soon!&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Lastly, don't worry: I will be at re:invent next year, so watch out Vegas! &lt;/p&gt;

</description>
    </item>
    <item>
      <title>Fish Cam : Lights, Camera, Swim!</title>
      <dc:creator>philbasford</dc:creator>
      <pubDate>Wed, 20 Oct 2021 12:12:35 +0000</pubDate>
      <link>https://forem.com/aws-builders/fish-cam-lights-camera-swim-5efe</link>
      <guid>https://forem.com/aws-builders/fish-cam-lights-camera-swim-5efe</guid>
<description>&lt;p&gt;So far in this series we have looked at the &lt;a href="https://dev.to/aws-builders/fish-cam-background-architecture-and-hardware-3m4a"&gt;general background&lt;/a&gt; and how to get your &lt;a href="https://dev.to/aws-builders/fish-cam-hocking-up-fish-to-the-cloud-5fag"&gt;Raspberry Pi set up&lt;/a&gt;. Now it is time to have a little more fun with what you can do with a web cam, a Raspberry Pi and AWS. For me, the initial fun was creating a live stream of two goldfish, Goldie and Star, swimming around their tank. I wanted to capture live footage and stream it in real time to anywhere in the world (well, wherever my daughter is with my iPhone!)&lt;/p&gt;

&lt;h1&gt;
  
  
  Initial Web Cam Setup
&lt;/h1&gt;

&lt;p&gt;Firstly, plug your web cam into a USB port on your Pi. Then, via the desktop or remote access, run raspi-config and enable the camera. &lt;/p&gt;

&lt;p&gt;I did this via SSH:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ssh pi

sudo raspi-config
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This brings up the main menu; please select the Interface Options: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9tk56t95afc9igz19iet.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9tk56t95afc9igz19iet.png" alt="config menu"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then the Camera option and then enable it:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fes74rsrbhmb6cigle12o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fes74rsrbhmb6cigle12o.png" alt="Interface Menu"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnnoijxizom1b0syk62be.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnnoijxizom1b0syk62be.png" alt="Enable Camera"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You should then see a message confirming it is enabled (hopefully):&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk93s33kuf54f070ngb6m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk93s33kuf54f070ngb6m.png" alt="Confirmation"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then you need to reboot:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F118k9juc6qc1jwqvlnkc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F118k9juc6qc1jwqvlnkc.png" alt="Reboot"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once rebooted, SSH (or whatever you use) back into your Pi and test it out. To test, I used the standard fswebcam utility to capture a frame:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo apt install fswebcam

fswebcam -r 1280x720 image.jpg
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then open up another connection, copy the file to your local machine and open the image:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;scp pi:image.jpg .
image.jpg                                                                                             100%  401KB   5.9MB/s   00:00 
open image.jpg
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You should see something, or even yourself, in the image:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5kytrpts3levadqtjkr7.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5kytrpts3levadqtjkr7.jpg" alt="Photo"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you want more details, please see &lt;a href="https://www.raspberrypi.com/documentation/computers/os.html#using-a-usb-webcam" rel="noopener noreferrer"&gt;the official docs&lt;/a&gt;.&lt;/p&gt;

&lt;h1&gt;
  
  
  Capturing Video
&lt;/h1&gt;

&lt;p&gt;To stream live video, you need to push the output from the web cam to a Kinesis Video Stream. To do this, AWS have a C++ producer library/SDK that you need to compile on your Pi, configure and then run. It uses the &lt;a href="https://gstreamer.freedesktop.org/" rel="noopener noreferrer"&gt;GStreamer Framework&lt;/a&gt;, and here is a link to the producer project on GitHub: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/awslabs/amazon-kinesis-video-streams-producer-sdk-cpp/releases" rel="noopener noreferrer"&gt;https://github.com/awslabs/amazon-kinesis-video-streams-producer-sdk-cpp/releases&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Create Kinesis Video Stream
&lt;/h2&gt;

&lt;p&gt;In order to use the producer library, first you need to create a stream. You can do this in the console or via the CLI (plus many more ways). I decided to use the CLI as it is simple enough:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws kinesisvideo create-stream --stream-name FishCam --region us-west-2
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If this is successful, the output contains an ARN for the stream. We will use the ARN in the next section, so please take a note of it. &lt;/p&gt;
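&lt;p&gt;The output is along these lines (the account ID and the trailing creation timestamp will be your own):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "StreamARN": "arn:aws:kinesisvideo:us-west-2:123456789011:stream/FishCam/1634380180064"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;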

&lt;h2&gt;
  
  
  Create IAM User
&lt;/h2&gt;

&lt;p&gt;Next we need to create an IAM user for the library to use to authenticate against AWS. Now, to be honest, I would prefer to use an assumed role here, and we might revisit this later in other articles in the series when using Greengrass. For now, however, please create an IAM user and generate an access key and secret key, but make sure you lock it down with something like the attached policy, using the ARN (which we noted down) as the resource:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "kinesisvideo:DescribeStream",
                "kinesisvideo:Get*",
                "kinesisvideo:PutMedia"
            ],
            "Resource": [
                "arn:aws:kinesisvideo:us-west-2:123456789011:stream/FishCam/1634380180064"
            ]
        }
    ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Install Prerequisites
&lt;/h2&gt;

&lt;p&gt;The producer library is C++, so we have to compile and build it. This requires some prerequisites, which you can install via apt:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo apt install pkg-config cmake m4 git
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now to use GStreamer we need to install some more:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo apt-get install libssl-dev libcurl4-openssl-dev liblog4cplus-dev libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev gstreamer1.0-plugins-base-apps gstreamer1.0-plugins-bad gstreamer1.0-plugins-good gstreamer1.0-plugins-ugly gstreamer1.0-tools
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Clone And Build
&lt;/h2&gt;

&lt;p&gt;Now we can compile and build the producer library. This can take some time, so I recommend doing it via a stable SSH or VNC connection.&lt;/p&gt;

&lt;p&gt;First, clone the project from GitHub:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git clone https://github.com/awslabs/amazon-kinesis-video-streams-producer-sdk-cpp.git
mkdir -p amazon-kinesis-video-streams-producer-sdk-cpp/build
cd amazon-kinesis-video-streams-producer-sdk-cpp/build
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then run CMake and Make. The GStreamer plugin is optional, but it is essential for our streaming, so make sure you pass the flag to CMake as per below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cmake -DBUILD_GSTREAMER_PLUGIN=TRUE ..
make
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Testing it out
&lt;/h2&gt;

&lt;p&gt;So let’s test it out and see if we can get the stream running. First, let’s check the setup by running the following:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cd ..
export GST_PLUGIN_PATH=`pwd`/build
export LD_LIBRARY_PATH=`pwd`/open-source/local/lib
gst-inspect-1.0 kvssink
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This should result in an output like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Factory Details:
  Rank                     primary + 10 (266)
  Long-name                KVS Sink
  Klass                    Sink/Video/Network
  Description              GStreamer AWS KVS plugin
  Author                   AWS KVS &amp;lt;kinesis-video-support@amazon.com&amp;gt;

Plugin Details:
  Name                     kvssink
  Description              GStreamer AWS KVS plugin
  Filename                 /home/pi/amazon-kinesis-video-streams-producer-sdk-cpp/build/libgstkvssink.so
  Version                  1.0
  License                  Proprietary
  Source module            kvssinkpackage
  Binary package           GStreamer
  Origin URL               http://gstreamer.net/
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;However, if you see something like the statement below, it means something failed and you need to look into the make logs to find out why. Mostly it is the GStreamer dependencies not being present, so check they installed correctly.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;No such element or plugin 'kvssink'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Start The Stream
&lt;/h2&gt;

&lt;p&gt;To start the streaming, take the IAM user's credentials, the stream name and the AWS region name, replace the placeholders in this command, and then run it on your Pi:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ gst-launch-1.0 autovideosrc device=/dev/video0 ! videoconvert ! video/x-raw,format=I420,width=640,height=480,framerate=30/1 ! 
                x264enc bframes=0 key-int-max=45 bitrate=500 ! video/x-h264,stream-format=avc,alignment=au,profile=baseline ! 
                kvssink stream-name=FishCam storage-size=512 access-key="YOUR_ACCESS_KEY" secret-key="YOUR_SECRET_ACCESS_KEY" aws-region="YOUR_AWS_REGION"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This should result in a log like the one below, telling you how many bytes have been streamed:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;2021-10-15 22:50:04 [1237316704] DEBUG - fragmentAckReceivedHandler invoked
2021-10-15 22:50:04 [1237316704] DEBUG - postReadCallback(): Wrote 1024 bytes to Kinesis Video. Upload stream handle: 1
2021-10-15 22:50:04 [1237316704] DEBUG - postWriteCallback(): Curl post body write function for stream with handle: PICam and upload handle: 1 returned: {"EventType":"PERSISTED","FragmentTimecode":1634334601102,"FragmentNumber":"91343852333348846620194705670290539091726766250"}

2021-10-15 22:50:04 [1237316704] DEBUG - fragmentAckReceivedHandler invoked
2021-10-15 22:50:04 [1237316704] DEBUG - postReadCallback(): Wrote 303 bytes to Kinesis Video. Upload stream handle: 1
2021-10-15 22:50:04 [1237316704] DEBUG - postReadCallback(): Wrote 296 bytes to Kinesis Video. Upload stream handle: 1
2021-10-15 22:50:04 [1237316704] DEBUG - postReadCallback(): Wrote 696 bytes to Kinesis Video. Upload stream handle: 1
2021-10-15 22:50:04 [1237316704] DEBUG - postReadCallback(): Wrote 427 bytes to Kinesis Video. Upload stream handle: 1
2021-10-15 22:50:04 [1237316704] DEBUG - postReadCallback(): Wrote 871 bytes to Kinesis Video. Upload stream handle: 1
2021-10-15 22:50:04 [1237316704] DEBUG - postReadCallback(): Wrote 551 bytes to Kinesis Video. Upload stream handle: 1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you see something like the output below, it means you have not replaced the placeholders correctly (watch the quoting):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;2021-10-15 22:03:50 [1389352032] DEBUG - describeStreamCurlHandler(): DescribeStream API response: 
2021-10-15 22:03:50 [1389352032] INFO - describeStreamResultEvent(): Describe stream result event.
2021-10-15 22:03:50 [1389352032] WARN - curlCompleteSync(): curl perform failed for url https://kinesisvideo.YOUR_AWS_REGION.amazonaws.com/describeStream with result Couldn't resolve host name: Could not resolve host: kinesisvideo.YOUR_AWS_REGION.amazonaws.com
2021-10-15 22:03:50 [1380959328] WARN - curlCompleteSync(): HTTP Error 0 : Response: (null)
Request URL: https://kinesisvideo.YOUR_AWS_REGION.amazonaws.com/describeStream
Request Headers:
    Authorization: AWS4-HMAC-SHA256 Credential=YOUR_ACCESS_KEY/20211015/YOUR_AWS_REGION/kinesisvideo/aws4_request, SignedHeaders=host;user-agent;x-amz-date, Signature=40201726f3815617c82ecdc62d5b9476936d19f81ce6cdda43fb3a05605bfc6c
    content-length: 37
    content-type: application/json
    host: kinesisvideo.YOUR_AWS_REGION.amazonaws.com
    user-agent: AWS
2021-10-15 22:03:50 [1380959328] DEBUG - describeStreamCurlHandler(): DescribeStream API response: 
2021-10-15 22:03:50 [1380959328] INFO - describeStreamResultEvent(): Describe stream result event.
2021-10-15 22:03:50 [1380959328] WARN - curlCompleteSync(): curl perform failed for url https://kinesisvideo.YOUR_AWS_REGION.amazonaws.com/describeStream with result Couldn't resolve host name: Could not resolve host: kinesisvideo.YOUR_AWS_REGION.amazonaws.com
2021-10-15 22:03:51 [1389352032] WARN - curlCompleteSync(): HTTP Error 0 : Response: (null)
Request URL: https://kinesisvideo.YOUR_AWS_REGION.amazonaws.com/describeStream
Request Headers:
    Authorization: AWS4-HMAC-SHA256 Credential=YOUR_ACCESS_KEY/20211015/YOUR_AWS_REGION/kinesisvideo/aws4_request, SignedHeaders=host;user-agent;x-amz-date, Signature=40201726f3815617c82ecdc62d5b9476936d19f81ce6cdda43fb3a05605bfc6c
    content-length: 37
    content-type: application/json
    host: kinesisvideo.YOUR_AWS_REGION.amazonaws.com
    user-agent: AWS

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Handy Bash Script
&lt;/h2&gt;

&lt;p&gt;To make things easier I then created the following script to start up the streaming when I need it. This saves me having to remember all the steps:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cd ~/amazon-kinesis-video-streams-producer-sdk-cpp
export GST_PLUGIN_PATH=`pwd`/build
export LD_LIBRARY_PATH=`pwd`/open-source/local/lib

gst-launch-1.0 autovideosrc device=/dev/video0 ! videoconvert ! video/x-raw,format=I420,width=640,height=480,framerate=30/1 ! x264enc bframes=0 key-int-max=45 bitrate=500 ! video/x-h264,stream-format=avc,alignment=au,profile=baseline !  kvssink stream-name="PICam" storage-size=512 access-key="XXXXX" secret-key="YYYYY" aws-region=us-west-2
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;More details on using the library can be found here:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/rekognition/latest/dg/streaming-using-gstreamer-plugin.html" rel="noopener noreferrer"&gt;https://docs.aws.amazon.com/rekognition/latest/dg/streaming-using-gstreamer-plugin.html&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Hopefully you have it working, so let's leave it running whilst we work out how to view the stream.&lt;/p&gt;

&lt;h1&gt;
  
  
  Viewing the Live Streaming Video
&lt;/h1&gt;

&lt;p&gt;That's all good, but hey, we still cannot see Goldie and Star swimming around that lovely tank we got them. Let's also remember the key requirement: seeing them doing their thing from wherever my daughter is on holiday, using our tablets or phones. Therefore I decided to use &lt;a href="https://en.wikipedia.org/wiki/HTTP_Live_Streaming" rel="noopener noreferrer"&gt;HLS&lt;/a&gt;, as most modern devices support it and you can use it within an HTML5 webpage. Luckily Kinesis Video Streams supports HLS, and AWS has a ready-made HTML5 page, &lt;a href="https://docs.aws.amazon.com/kinesisvideostreams/latest/dg/hls-playback.html" rel="noopener noreferrer"&gt;see here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;To make this work we need to reuse the IAM user and hardcode the same credentials into the page. This is &lt;em&gt;not good&lt;/em&gt;, so I took a look at Cognito and set up a role etc. However, it just returned a 403 every time, and it took me a few hours of digging to find out why. Here is the shocker: authentication via Cognito does not support all AWS services, &lt;a href="https://docs.aws.amazon.com/cognito/latest/developerguide/iam-roles.html" rel="noopener noreferrer"&gt;see Access Policies in this doc&lt;/a&gt;. I have no idea why Kinesis Video Streams is not on that list, especially given that Kinesis Data Streams is. So it means we are stuck with the IAM user and hardcoded credentials :-(.&lt;/p&gt;

&lt;h2&gt;
  
  
  HTML5 client
&lt;/h2&gt;

&lt;p&gt;This resulted in an HTML5 page like the following:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;script src="https://cdn.jsdelivr.net/npm/hls.js@latest"&amp;gt;&amp;lt;/script&amp;gt;
&amp;lt;script src="https://cdnjs.cloudflare.com/ajax/libs/aws-sdk/2.278.1/aws-sdk.min.js"&amp;gt;&amp;lt;/script&amp;gt;
&amp;lt;link rel="stylesheet" href="https://vjs.zencdn.net/6.6.3/video-js.css"&amp;gt;
&amp;lt;script src="https://vjs.zencdn.net/6.6.3/video.js"&amp;gt;&amp;lt;/script&amp;gt;
&amp;lt;script src="https://cdnjs.cloudflare.com/ajax/libs/videojs-contrib-hls/5.14.1/videojs-contrib-hls.js"&amp;gt;&amp;lt;/script&amp;gt;

&amp;lt;video id="video"  controls autoplay&amp;gt;&amp;lt;/video&amp;gt;
&amp;lt;script&amp;gt;

  var protocol = 'HLS';
  var streamName = 'FishCam';

  // Step 1: Configure SDK Clients
  var options = {
    accessKeyId: 'AAAAAAAAA',
    secretAccessKey: 'BBBBBBBBBB',
    region: 'us-west-2'
  }
  var kinesisVideo = new AWS.KinesisVideo(options);
  var kinesisVideoArchivedContent = new AWS.KinesisVideoArchivedMedia(options);

  // Step 2: Get a data endpoint for the stream
  console.log('Fetching data endpoint');
  console.log(AWS.config.credentials);

  kinesisVideo.getDataEndpoint({
    StreamName: streamName,
    APIName: "GET_HLS_STREAMING_SESSION_URL"
  }, 
  function (err, response) {

    if (err) { return console.error(err); }
    console.log('Data endpoint: ' + response.DataEndpoint);
    kinesisVideoArchivedContent.endpoint = new AWS.Endpoint(response.DataEndpoint);

    // Step 3: Get a Streaming Session URL
    var consoleInfo = 'Fetching ' + protocol + ' Streaming Session URL';
    console.log(consoleInfo);

    kinesisVideoArchivedContent.getHLSStreamingSessionURL({
      StreamName: streamName,
      PlaybackMode: 'LIVE',
      DiscontinuityMode: 'ALWAYS',
      MaxMediaPlaylistFragmentResults: '1000',
      Expires: '30000',
      HLSFragmentSelector: {
        FragmentSelectorType: 'SERVER_TIMESTAMP'
      }
    }, function (err, response) {

        if (err) { return console.error(err); }
        console.log('HLS Streaming Session URL: ' + response.HLSStreamingSessionURL);
        var video = document.getElementById('video');
        if (Hls.isSupported()) {
          var hls = new Hls();
          hls.loadSource(response.HLSStreamingSessionURL);
          hls.attachMedia(video);
          hls.on(Hls.Events.MANIFEST_PARSED, function () {
            // play() returns a Promise, so report failures via catch()
            video.play().catch(function (err) { console.error(err); });
          });
        }
        // hls.js is not supported on platforms that do not have Media Source Extensions (MSE) enabled.
        // When the browser has built-in HLS support (check using `canPlayType`), we can provide an HLS manifest (i.e. .m3u8 URL) directly to the video element through the `src` property.
        // This is using the built-in support of the plain video element, without using hls.js.
        // Note: it would be more normal to wait on the 'canplay' event below however on Safari (where you are most likely to find built-in HLS support) the video.src URL must be on the user-driven
        // white-list before a 'canplay' event will be emitted; the last video event that can be reliably listened-for when the URL is not on the white-list is 'loadedmetadata'.
        else if (video.canPlayType('application/vnd.apple.mpegurl')) {
          video.src = response.HLSStreamingSessionURL;
          video.addEventListener('loadedmetadata', function () {
            video.play();
          });
        }
    })
  })
&amp;lt;/script&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Hosting the Webpage
&lt;/h2&gt;

&lt;p&gt;We need somewhere to host our HTML5 client and, to be honest, I did not want to spend too much money doing so. Therefore I created a new S3 bucket with a relatively meaningful name and then enabled static hosting:&lt;br&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1yxt2swirzaxffdmwm0s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1yxt2swirzaxffdmwm0s.png" alt="Static Site"&gt;&lt;/a&gt;&lt;br&gt;
I then named the page index.html and uploaded it to S3 so that it was served as the root page of the web address.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9u4rlfvwkzbtc4m7wydp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9u4rlfvwkzbtc4m7wydp.png" alt="Bucket Contents"&gt;&lt;/a&gt;&lt;br&gt;
Lastly I took the static hosting URL, opened it in my browser and on my phone, and tested it out. It can take a few seconds to load, but if successful you should see your fish swimming around, or whatever your subject is. &lt;/p&gt;
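If you prefer the command line, the console steps above can be sketched roughly with the AWS CLI. This is a minimal sketch, assuming a made-up bucket name &lt;code&gt;fishcam-viewer&lt;/code&gt; (yours must be globally unique) and that your account settings allow the objects to be read publicly:

```shell
# Create the bucket and enable static website hosting
# (fishcam-viewer is a hypothetical name - pick your own)
aws s3 mb s3://fishcam-viewer --region us-west-2
aws s3 website s3://fishcam-viewer --index-document index.html

# Upload the HTML5 client as the root page
aws s3 cp index.html s3://fishcam-viewer/

# The site is then served from the bucket's website endpoint:
# http://fishcam-viewer.s3-website-us-west-2.amazonaws.com
```

Note that newer buckets block public access by default, so you may also need a bucket policy allowing <code>s3:GetObject</code> before the page is reachable.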

&lt;h1&gt;
  
  
  Conclusion
&lt;/h1&gt;

&lt;p&gt;This was great fun and works surprisingly well. My daughter was able to view her fish while she was on holiday in Cornwall and Scotland. I also have a Philips Hue light bulb in the room, so if it is too dark I can turn the light on as well. I hope you enjoyed this, and please get in touch if you have any feedback.&lt;/p&gt;

&lt;p&gt;Come back soon for more fishy fun!&lt;/p&gt;

</description>
      <category>aws</category>
      <category>iot</category>
      <category>machinelearning</category>
      <category>edge</category>
    </item>
    <item>
      <title>Fish Cam : Hooking up fish to the cloud</title>
      <dc:creator>philbasford</dc:creator>
      <pubDate>Mon, 20 Sep 2021 16:14:32 +0000</pubDate>
      <link>https://forem.com/aws-builders/fish-cam-hocking-up-fish-to-the-cloud-5fag</link>
      <guid>https://forem.com/aws-builders/fish-cam-hocking-up-fish-to-the-cloud-5fag</guid>
      <description>&lt;p&gt;Hopefully reading my first blog, Fish Cam : &lt;a href="https://dev.to/philbasford/fish-cam-background-architecture-and-hardware-3m4a"&gt;Background, Architecture and Hardware&lt;/a&gt;, inspired you to start your own exciting ‘pet’ project or maybe you're just playing around. Either way fantastic, so hopefully your got your PI and USB web cam handy and connected up. Don’t worry about connecting the breadboard to the GPIO, I will handle that in another blog. In this blog we will cover installing an OS, settling up AWS IoT Greengrass and connecting it to AWS Systems Manager.&lt;/p&gt;

&lt;h1&gt;
  
  
  Basics:
&lt;/h1&gt;

&lt;p&gt;One of the features of my PI that I love is that the hard disk is an SD card. This means that you have to flash it with an OS image of your liking. However, you can buy a few SD cards for under £10, which means you can keep swapping them, allowing you to try out a few different options and setups. Neat!&lt;/p&gt;

&lt;p&gt;There are many options for an OS on a PI, but for Greengrass you need an ARM variant of Linux with a 4.4 kernel (more details on the requirements here: &lt;a href="https://docs.aws.amazon.com/greengrass/v2/developerguide/setting-up.html#greengrass-v2-requirements" rel="noopener noreferrer"&gt;https://docs.aws.amazon.com/greengrass/v2/developerguide/setting-up.html#greengrass-v2-requirements&lt;/a&gt;). Raspberry Pi OS, formerly Raspbian, is the prime choice, so that’s what I used for Fish Cam. You then have a choice of edition; easy, medium or hard:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Raspberry Pi OS with desktop and recommended software:&lt;/strong&gt; Easy, has lots installed, and if you only have one SD card and you are a first timer then it will let you explore all things PI. However, it is very bloated if you just want to do an IoT and/or ML project.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Raspberry Pi OS with desktop:&lt;/strong&gt; Medium, this edition has everything you need plus a desktop. Some of you might like to use that, as I did starting out. However, you need some more kit: a monitor, a keyboard and a mouse. It also affects your remote access options (more on that later).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Raspberry Pi OS Lite:&lt;/strong&gt; Hard, this is for more experienced Linux people, as basically you only have a command line and no desktop. I love the command line, and all the setup of Greengrass and Systems Manager is done via the command line anyway. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To be honest, I said Lite was hard, but in some ways it is actually the quickest and most straightforward. Therefore I used this edition for Fish Cam. To get your PI up and running you need four things (most are in the starter kit I recommended in &lt;a href="https://dev.to/philbasford/fish-cam-background-architecture-and-hardware-3m4a"&gt;Background, Architecture and Hardware&lt;/a&gt;): &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The SD card that came with your PI or bought separately.&lt;/li&gt;
&lt;li&gt;A desktop or laptop; I have a MacBook Pro with only USB-C. I will cover the Mac steps in this blog, but roughly the same works for other platforms. &lt;/li&gt;
&lt;li&gt;A micro SD card reader.&lt;/li&gt;
&lt;li&gt;Your SD reader might plug into an SD port or USB directly, but if not, an adapter to convert (mine is micro SD to USB 2, so I used a USB 2 -&amp;gt; USB 3 adapter).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Once that’s all plugged together and plugged into your Mac, the volume appears under Devices in the Finder sidebar. If you’re reusing an SD card then you need to format it (make sure you do not lose anything important in doing so).&lt;/p&gt;

&lt;h1&gt;
  
  
  Install Raspberry Pi OS
&lt;/h1&gt;

&lt;p&gt;To flash your SD card you need to download and run the &lt;a href="https://www.raspberrypi.org/software/" rel="noopener noreferrer"&gt;Raspberry Pi Imager&lt;/a&gt;, selecting the edition you would like and then writing it to the SD card. &lt;/p&gt;

&lt;p&gt;If you’re using easy or medium (a desktop edition) then you can do all the network and remote access configuration from the desktop, so eject your SD card, put it in your PI, boot your PI and please follow:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.raspberrypi.org/documentation/computers/configuration.html#configuring-networking-2" rel="noopener noreferrer"&gt;https://www.raspberrypi.org/documentation/computers/configuration.html#configuring-networking-2&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you’re going down the hard edition route you need to create a blank file called &lt;code&gt;SSH&lt;/code&gt; with no content and a file called &lt;code&gt;wpa_supplicant.conf&lt;/code&gt; at the root of your SD card (this is the boot partition) containing your WiFi details (see also the guide above):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
country=&amp;lt;Insert 2 letter ISO 3166-1 country code here, I used GB&amp;gt;
update_config=1

network={
 ssid="&amp;lt;Name of your wireless LAN&amp;gt;"
 psk="&amp;lt;Password for your wireless LAN&amp;gt;"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, sometimes power management can turn off your network connection when idle, killing your remote access and not allowing you to bring it out of the idle state. So you need to disable this with:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo iwconfig wlan0 power off
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Note you may need to persist this across reboots; you can find some ways of doing &lt;a href="https://askubuntu.com/questions/85214/how-can-i-prevent-iwconfig-power-management-from-being-turned-on" rel="noopener noreferrer"&gt;this here&lt;/a&gt;.&lt;/p&gt;
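One way of persisting it (a sketch only; the unit name is my own invention, and depending on when wlan0 comes up you may need to tweak the ordering) is a small systemd service that runs the command at boot:

```
# /etc/systemd/system/wifi-powersave-off.service
[Unit]
Description=Disable WiFi power management on wlan0
After=network.target

[Service]
Type=oneshot
ExecStart=/sbin/iwconfig wlan0 power off

[Install]
WantedBy=multi-user.target
```

Enable it once with `sudo systemctl enable wifi-powersave-off.service`.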

&lt;p&gt;Now you're ready, so eject your SD, put the SD in your PI and boot it. &lt;/p&gt;

&lt;h1&gt;
  
  
  Accessing my PI
&lt;/h1&gt;

&lt;p&gt;Now basically there are few ways of connecting your PI:&lt;/p&gt;

&lt;h3&gt;
  
  
  SSH (hard)
&lt;/h3&gt;

&lt;p&gt;The key is to know your PI's IP address so you can connect. For those who took either the easy or medium option, this is shown to you in the desktop wizard when you first boot. For the hard option (command line only edition) you should scan your network to find the new IP address (I used Fing on my iPhone). Once you know the IP address of your PI you will be able to connect via SSH. I connected from my MacBook into the PI using the pi user and default password. If this is your first connection, please change your password, and I recommend setting up an SSH key. This will give you slick and secure access, more &lt;a href="https://medium.com/@sandeeparneja/how-to-ssh-from-a-mac-to-raspberrypi-without-entering-the-password-every-time-afd769ecfb6" rel="noopener noreferrer"&gt;details here&lt;/a&gt;. &lt;/p&gt;
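Setting up a key from the Mac side looks roughly like this (the key file name &lt;code&gt;pi_fishcam&lt;/code&gt; and the IP address 192.168.1.42 are just examples; substitute your own):

```shell
# Generate a dedicated key pair (no passphrase here; add one if you prefer)
mkdir -p ~/.ssh
ssh-keygen -t ed25519 -f ~/.ssh/pi_fishcam -N '' -q

# Then push the public key to the PI and connect without a password
# (replace 192.168.1.42 with the IP address you found for your PI):
#   ssh-copy-id -i ~/.ssh/pi_fishcam.pub pi@192.168.1.42
#   ssh -i ~/.ssh/pi_fishcam pi@192.168.1.42
```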

&lt;h3&gt;
  
  
  VNC  (easy or  medium)
&lt;/h3&gt;

&lt;p&gt;If you went for one of the desktop editions then you can also use VNC for remote access, and for this I would recommend creating a cloud account and registering your PI. This means you can then connect from a VNC Viewer on any device, including from your iPhone on holiday to restart something when the web cam stops working. &lt;a href="https://magpi.raspberrypi.org/articles/vnc-raspberry-pi" rel="noopener noreferrer"&gt;See here for more details&lt;/a&gt;. &lt;/p&gt;

&lt;h1&gt;
  
  
  Python
&lt;/h1&gt;

&lt;p&gt;I am using Python 3.8 to develop some of the features of Fish Cam within Greengrass, and all my sensors work with Python libraries. Therefore I installed it from source (which Python version to use is up to you):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;wget https://www.python.org/ftp/python/3.8.12/Python-3.8.12.tgz -o python.tgz
tar -xzvf python.tgz 
cd Python-3.8.12/
./configure --enable-optimizations
make
sudo make install
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h1&gt;
  
  
  Greengrass
&lt;/h1&gt;

&lt;p&gt;Greengrass is the AWS technology that connects devices located at the “edge” (devices in remote locations, possibly low powered) up to AWS. This allows you to do lots of things, like collecting sensor data and changing the state of a device, i.e. powering it on or toggling a switch. Fish Cam will mainly use it for measuring water temperature, pH and Total Dissolved Solids (more on how to do this in the future). &lt;/p&gt;

&lt;p&gt;AWS has two versions of Greengrass, V1 and V2. For Fish Cam it really did not matter which version I used, as both support the features we need. However, if you’re using Greengrass for anything else you may wish to &lt;a href="https://docs.aws.amazon.com/greengrass/v2/developerguide/move-from-v1.html" rel="noopener noreferrer"&gt;check here&lt;/a&gt;. Also, if you’re using Greengrass on certified hardware then you also need to be aware of which major and minor versions are supported.&lt;/p&gt;

&lt;p&gt;Anyway, for Fish Cam I went with V2 and followed the AWS guides, which are very good. To start with I followed Steps 9, 10 and 11:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/greengrass/v1/developerguide/setup-filter.rpi.html" rel="noopener noreferrer"&gt;https://docs.aws.amazon.com/greengrass/v1/developerguide/setup-filter.rpi.html&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I then completed “Set up your environment” Step 4  and all of “Install the AWS IoT Greengrass Core software”:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/greengrass/v2/developerguide/getting-started.html" rel="noopener noreferrer"&gt;https://docs.aws.amazon.com/greengrass/v2/developerguide/getting-started.html&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;My setup command was:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo -E java -Droot="/greengrass/v2" -Dlog.store=FILE \
  -jar ./GreengrassCore/lib/Greengrass.jar \
  --aws-region eu-west-1 \
  --thing-name FishCamv2Core \
  --thing-group-name FishCamv2CoreGroup \
  --thing-policy-name GreengrassV2IoTThingPolicy \
  --tes-role-name GreengrassV2TokenExchangeRole \
  --tes-role-alias-name GreengrassCoreTokenExchangeRoleAlias \
  --component-default-user ggc_user:ggc_group \
  --provision true \
  --setup-system-service true

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h1&gt;
  
  
  Systems Manager
&lt;/h1&gt;

&lt;p&gt;AWS Systems Manager is not really a single service but a suite of services aimed at managing the OS, remote management above the infrastructure, and remote connectivity of your "instances". &lt;a href="https://aws.amazon.com/blogs/mt/manage-raspberry-pi-devices-using-aws-systems-manager/" rel="noopener noreferrer"&gt;Here is an AWS guide on how to set up Systems Manager on your PI&lt;/a&gt;, and the following picture shows the activation screen I used for activating Systems Manager:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1t2kodgvunfrltawe3qb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1t2kodgvunfrltawe3qb.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Fleet Manager
&lt;/h3&gt;

&lt;p&gt;New at re:invent 2020, &lt;a href="https://aws.amazon.com/blogs/aws/new-aws-systems-manager-fleet-manager/" rel="noopener noreferrer"&gt;Fleet Manager&lt;/a&gt; allows you to interact with and manage on-premises servers or remote devices from AWS. By installing the Systems Manager agent on my PI it was automatically enrolled into Fleet Manager, so I decided to take a look around and see what it could do for my PI.  &lt;/p&gt;

&lt;p&gt;The first thing to look at is the main dashboard screen, where you can see an overview of your fleet and then drill into each device:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9mweu8uz6rw4nzcfyo62.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9mweu8uz6rw4nzcfyo62.png" alt="Main Screen"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;One of the features of Fleet Manager is that you can browse the file system of the device, create/delete directories, and upload/download files: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpdrviwpwwfaavgyzgdbs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpdrviwpwwfaavgyzgdbs.png" alt="File Screen"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Another feature is the ability to monitor system performance from the AWS Console:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwah9xg8kt0eg1lltl395.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwah9xg8kt0eg1lltl395.png" alt="System Performance"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The last main feature of Fleet Manager is the ability to manage OS users and groups from the AWS Console:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffpm3oa5i16b4erk0wa9u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffpm3oa5i16b4erk0wa9u.png" alt="Users and Groups"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Fleet Manager has two tiers: standard and advanced. You need the advanced tier if:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;You want to use Session Manager to interactively access on-premises instances.&lt;/li&gt;
&lt;li&gt;You want to use Patch Manager to patch Microsoft applications hosted on on-premises instances (not needed for Fish Cam).&lt;/li&gt;
&lt;li&gt;You need to register more than 1,000 on-premises instances (not needed for Fish Cam).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;However, the advanced tier costs a small amount more per instance per hour, see &lt;a href="https://aws.amazon.com/systems-manager/pricing" rel="noopener noreferrer"&gt;https://aws.amazon.com/systems-manager/pricing&lt;/a&gt;. It can be enabled in the settings area of Fleet Manager:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F56ctlk08myuoxy0n3004.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F56ctlk08myuoxy0n3004.png" alt="Permissions"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Session Manager
&lt;/h3&gt;

&lt;p&gt;Session Manager provides the ability to connect to instances both in AWS and remotely. To connect to remote instances like my PI you need to enable encryption and the advanced tier, or you get an error like the one below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq2dbgidpmhtmq9ewiwxh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq2dbgidpmhtmq9ewiwxh.png" alt="Warning"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/encrypt-log-data-kms.html" rel="noopener noreferrer"&gt;AWS provide some details here&lt;/a&gt; and here my guide to it.&lt;/p&gt;

&lt;h4&gt;
  
  
  KMS
&lt;/h4&gt;

&lt;p&gt;In order to connect securely to the instance you first need to create a KMS key. I recommend setting up an alias so that it is easy to reference and can be rotated; I called mine &lt;code&gt;/fishcam/instances&lt;/code&gt;:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyt6udxiaxo7p190okvbw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyt6udxiaxo7p190okvbw.png" alt="KMS"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h5&gt;
  
  
  KMS Resource Policy
&lt;/h5&gt;

&lt;p&gt;I applied the following key policy to allow the instance's role to use the key, and for CloudWatch Logs to use it to encrypt the terminal logs:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "Version": "2012-10-17",
    "Id": "key-consolepolicy-3",
    "Statement": [
        {
            "Sid": "Allow access for Key Administrators",
            "Effect": "Allow",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::1234567890:role/SandboxLocalAdmin",
                    "arn:aws:iam::1234567890:role/FullAdminRole"
                ]
            },
            "Action": "kms:*",
            "Resource": "*"
        },
        {
            "Sid": "Allow use of the key",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::1234567890:role/service-role/AmazonEC2RunCommandRoleForManagedInstances"
            },
            "Action": [
                "kms:Encrypt",
                "kms:Decrypt",
                "kms:ReEncrypt*",
                "kms:GenerateDataKey*",
                "kms:CreateGrant",
                "kms:ListGrants",
                "kms:RevokeGrant",
                "kms:Describe*"
            ],
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "logs.eu-west-1.amazonaws.com"
            },
            "Action": [
                "kms:Encrypt*",
                "kms:Decrypt*",
                "kms:ReEncrypt*",
                "kms:GenerateDataKey*",
                "kms:Describe*"
            ],
            "Resource": "*"
        }
    ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h5&gt;
  
  
  IAM role
&lt;/h5&gt;

&lt;p&gt;I also had to add the following policy to the &lt;code&gt;AmazonEC2RunCommandRoleForManagedInstances&lt;/code&gt; role (you may wish to restrict these further for a production system):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "kms:*",
                "logs:*"
            ],
            "Resource": "*"
        }
    ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Log Group
&lt;/h4&gt;

&lt;p&gt;Once the KMS key is in place, you can create the Log Group with KMS encryption enabled:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwmtivzndqcdr1fvcyzcs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwmtivzndqcdr1fvcyzcs.png" alt="Log Group"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Enable Encryption
&lt;/h4&gt;

&lt;p&gt;Then Session Manager can be configured to use the KMS key and the Log Group:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fga1syfdl2t4pi580jrhl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fga1syfdl2t4pi580jrhl.png" alt="Enable Encryption"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Session Manager Shell
&lt;/h4&gt;

&lt;p&gt;Once this is done, Session Manager will be able to connect:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyxx5qdu120wkqxleqxjq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyxx5qdu120wkqxleqxjq.png" alt="Shell"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Run Command
&lt;/h3&gt;

&lt;p&gt;Another feature of Systems Manager is the ability to execute Run Books to patch, install software or perform common tasks using Run Command.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftvccop28jd2euobua33b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftvccop28jd2euobua33b.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;
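&lt;p&gt;Run Command can also be driven from code. Here is a minimal sketch using the AWS-RunShellScript document; the managed instance ID and the command are placeholders:&lt;/p&gt;

```python
# Sketch: run a shell command on managed instances via Run Command.
# The instance ID and command are placeholder values.
def run_command_request(instance_ids, commands):
    # Keyword arguments for ssm.send_command(**request)
    return {
        "InstanceIds": instance_ids,
        "DocumentName": "AWS-RunShellScript",
        "Parameters": {"commands": commands},
    }

request = run_command_request(["mi-0123456789abcdef0"], ["sudo apt-get update"])
# The real call targets live instances, so it is left commented out:
# import boto3
# boto3.client("ssm").send_command(**request)
print(request["DocumentName"])
```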

&lt;h1&gt;
  
  
  Conclusion
&lt;/h1&gt;

&lt;p&gt;I hope this blog has shown you how to get your Pi up and running, along with all the different ways you can connect to it. You now have your own DeepLens clone to play with and do some fun stuff.&lt;/p&gt;

&lt;p&gt;If you want to know how to live stream then here is the next blog in the series &lt;a href="https://dev.to/aws-builders/fish-cam-lights-camera-swim-5efe"&gt;https://dev.to/aws-builders/fish-cam-lights-camera-swim-5efe&lt;/a&gt; &lt;/p&gt;

</description>
      <category>aws</category>
      <category>iot</category>
      <category>machinelearning</category>
      <category>edge</category>
    </item>
    <item>
      <title>Fish Cam : Background, Architecture and Hardware</title>
      <dc:creator>philbasford</dc:creator>
      <pubDate>Mon, 06 Sep 2021 22:16:03 +0000</pubDate>
      <link>https://forem.com/aws-builders/fish-cam-background-architecture-and-hardware-3m4a</link>
      <guid>https://forem.com/aws-builders/fish-cam-background-architecture-and-hardware-3m4a</guid>
<description>&lt;p&gt;Fish Cam is a ‘pet’ project of mine; it comes from wanting to allow my daughter to see her two goldfish, Goldie and Star, in their tank from wherever she is on holiday. In doing so it provided me with a way to play with some of the latest technology and AWS services, including IoT, Real-Time Data Ingestion and ML @ the Edge.&lt;/p&gt;

&lt;h1&gt;
  
  
  Fish Background:
&lt;/h1&gt;

&lt;p&gt;So let’s start with what not to do or assume: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Goldfish are easy pets for kids to look after&lt;/li&gt;
&lt;li&gt;It’s very romantic to win them at a funfair&lt;/li&gt;
&lt;li&gt;A small round bowl, or a tank bought at the funfair, will do &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;All of this is basically wrong, and any respectable pet store will tell you that. I blame cartoons for these falsities. However, Goldie and Star were funfair fish that my daughter won about 3 years ago. &lt;/p&gt;

&lt;p&gt;In addition, I was worried about us becoming goldfish owners, based on my previous experience with goldfish whilst growing up. My sister had some, and I remembered that they seemed to die very often. So I was very sceptical that ours would survive, and I knew we had to kit them out properly. So we took the decision, and down to the pet store we went to buy a proper tank, filter and pump.&lt;/p&gt;

&lt;p&gt;It transpired from the advice given by the store and online research that this experience of fish dying is very common. The reason most goldfish die is due to two basic things:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The tank is too small and this stunts their growth&lt;/li&gt;
&lt;li&gt;The water quality is poor and they suffer disease; water quality is again related to tank size and regular cleaning &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Two very hasty ways to die! Both the owner’s fault. The reason is that goldfish are very “messy”, or produce a lot of 💩. Yep, it comes down to the amount of 💩 and how often it is cleaned away. Goldfish need about 35-60 litres (or more) of clean water each; that’s a considerable tank for two fish, it will set you back around £200 in the UK, and then you need to clean it roughly every 2-3 weeks. Hence they are not really low-maintenance or child-friendly pets. Your pet store will advise you of all this and in most cases won’t sell you goldfish without an adequate tank and filter.&lt;/p&gt;

&lt;p&gt;So what we did was buy an initial £80 tank (around 45 litres) to get started (that’s still a lot bigger than those round bowls or fairground tanks). After a year, once the fish had grown and we knew what we were doing, we upgraded to a 65 litre tank (see below). &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9rwp8h4upq78jbmhpi7o.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9rwp8h4upq78jbmhpi7o.jpg" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;One of my aims for the project was also to help monitor the fish and the water quality. The fish, however, gave us their own strong indication that they are happy and healthy, as we got eggs this year. Luckily our fish are both female.&lt;/p&gt;

&lt;h1&gt;
  
  
  Technical Background:
&lt;/h1&gt;

&lt;p&gt;From a young age I have always been into technology and also an inventor of sorts. I am very lucky as lots of my current work is Data, Analytics and ML. However my specific interest in this project and producing a Fish Cam predates this. &lt;/p&gt;

&lt;p&gt;In 2017 I was working for another company, leading a cloud migration project, and I was lucky enough to go to my first re:invent. It was brilliant and inspiring; lots of what I do today is directly related to those 5 days. For AWS, 2017 was also a big year! And one thing that caught my eye was the DeepLens.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F06s9g7tcx9bs2oor6ci5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F06s9g7tcx9bs2oor6ci5.png" alt="Deeplens from AWS"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The DeepLens (as shown in the picture above) is a small device with a camera that allows you to run a computer vision model. You can also connect it to the AWS Cloud as an IoT device running Linux. &lt;/p&gt;

&lt;p&gt;Now at re:invent 2017, according to Andy Jassy, you could go home with one if you did a hands-on workshop, or if you attended any session on the ML track you would get a voucher to buy one when they went GA. So I did a session on SageMaker (also a new 2017 service) and waited for my voucher. It never turned up. Even after asking around my AWS contacts I did not get one, and furthermore it transpired that Amazon would not be able to ship them to the UK 😭&lt;/p&gt;

&lt;p&gt;That was the end of that! Well, until 2019, when my wife and daughters gave me a Raspberry Pi 3 for my birthday. I had an old HD webcam in my box of cables, so I wondered if I could build my own DeepLens. This project is how to do it.&lt;/p&gt;

&lt;h1&gt;
  
  
  Hardware:
&lt;/h1&gt;

&lt;p&gt;To build your own IoT play device / DeepLens clone you can use lots of different devices, but here is what I used:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4bhwsp3xw3p2ex7h759x.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4bhwsp3xw3p2ex7h759x.jpg" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Raspberry Pi 3 or 4&lt;/strong&gt; (including case, 32GB or larger SD card, flash module, USB power): something like the Raspberry Pi 3 B+ Starter Kit with 32GB Micro SD Card &lt;a href="https://smile.amazon.co.uk/dp/B09BFF1PSM/ref=cm_sw_r_cp_api_glt_fabc_BE0VD7TGQBBBRCT7J2HY?_encoding=UTF8&amp;amp;psc=1" rel="noopener noreferrer"&gt;https://smile.amazon.co.uk/dp/B09BFF1PSM/ref=cm_sw_r_cp_api_glt_fabc_BE0VD7TGQBBBRCT7J2HY?_encoding=UTF8&amp;amp;psc=1&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;A standard USB HD webcam&lt;/strong&gt;: this was the one I had, a Logitech C920 HD Pro Webcam, Full HD 1080p/30fps &lt;a href="https://smile.amazon.co.uk/dp/B006A2Q81M/ref=cm_sw_r_cp_api_glt_fabc_6YSZ5772CNJG7VGSYFXP?_encoding=UTF8&amp;amp;psc=1" rel="noopener noreferrer"&gt;https://smile.amazon.co.uk/dp/B006A2Q81M/ref=cm_sw_r_cp_api_glt_fabc_6YSZ5772CNJG7VGSYFXP?_encoding=UTF8&amp;amp;psc=1&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Now for future needs you may want to buy a breadboard, some jumper wires etc. These will allow you to connect sensors to the Pi’s GPIO pins and make up circuits. Here is what I purchased (but others will also do):&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;ELEGOO Upgraded Electronics Fun Kit&lt;/strong&gt; w/Power Supply Module, Jumper Wire, Precision Potentiometer, 830 tie-points Breadboard for Arduino UNO R3, MEGA, Raspberry Pi, STM32, Datesheet Available &lt;a href="https://smile.amazon.co.uk/dp/B01LZRV539/ref=cm_sw_r_cp_api_glt_fabc_MYA29BNRZP9GT0ZJ353W?_encoding=UTF8&amp;amp;psc=1" rel="noopener noreferrer"&gt;https://smile.amazon.co.uk/dp/B01LZRV539/ref=cm_sw_r_cp_api_glt_fabc_MYA29BNRZP9GT0ZJ353W?_encoding=UTF8&amp;amp;psc=1&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Lastly:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;One hair band&lt;/strong&gt;: My daughters spare hair band to hold the lid on the case&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  Features and Architecture
&lt;/h1&gt;

&lt;p&gt;Once I had connected up the components I then built the following solution with them and AWS:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsgcc185cfpocbudsy5zf.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsgcc185cfpocbudsy5zf.jpeg" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The features are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Real Time Video Streaming&lt;/strong&gt;: One of the key needs from my daughter was that she wanted to be able to see her fish swimming in their tank, from wherever she is, on her tablet or my iPhone. For this I used a Kinesis Video Stream, a C++ client to capture the video from the USB camera and send it to the stream, and an HTML5 client hosted in S3, connecting to the stream using HTTP Live Streaming.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Real Time Water Quality&lt;/strong&gt;: For this I connected a DS18B20 temperature sensor to the GPIO interface on my Raspberry Pi. I then created an edge Lambda to read the raw data from the device interface. The Lambda created a JSON message and sent it via IoT Greengrass and IoT Core to an MQTT topic. Then I created an IoT rule on the topic that uses the CloudWatch Put Metric connector to place the temperature into CloudWatch. Finally I created a CloudWatch dashboard that shows the temperature over whatever time period I want, up to the last minute. &lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frt2aspkznu6cjblofcop.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frt2aspkznu6cjblofcop.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;
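&lt;p&gt;To give a flavour of the edge Lambda described above, here is a sketch of reading and packaging a DS18B20 reading. The 1-Wire sysfs file format is standard, but the message shape and thing name are my own assumptions, not the exact code I run:&lt;/p&gt;

```python
import json

# The DS18B20 appears under /sys/bus/w1/devices as a w1_slave file with
# two lines: a CRC status line ending in YES, and a line ending in
# t=NNNNN, the temperature in millidegrees Celsius.
def parse_w1_slave(raw):
    if "YES" not in raw.splitlines()[0]:
        raise ValueError("CRC check failed")
    millis = int(raw.split("t=")[1].strip())
    return millis / 1000.0

def reading_message(raw, thing_name="fish-cam"):
    # JSON payload to publish to the MQTT topic via Greengrass;
    # the field names here are illustrative.
    return json.dumps({"thing": thing_name, "temperature_c": parse_w1_slave(raw)})

sample = "73 01 4b 46 7f ff 0c 10 41 : crc=41 YES\n73 01 4b 46 7f ff 0c 10 41 t=23187\n"
print(reading_message(sample))
```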

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Real Time Fish Detection&lt;/strong&gt;: The ability to detect the fish swimming around the tank in real time. Not really a requirement from my daughter, but I was just playing to see how good Rekognition was. To do this I connected the existing Kinesis Video Stream to Rekognition (in real time) via a subscription that pushes its results into a Kinesis Data Stream that I created. I then created a Kinesis Data Firehose delivery stream pulling data from the Kinesis Data Stream to save the results to S3 every minute.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Edge Device and Management&lt;/strong&gt;: A prerequisite for &lt;strong&gt;Real Time Water Quality Dashboards&lt;/strong&gt; was that I needed to set up my Raspberry Pi as a Thing! To do this I had to install Java 11 and the IoT Greengrass V2 software, run the configuration on my Pi and connect it to IoT Core.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Remote hands and patching&lt;/strong&gt;: Lastly I needed the ability to log in to the Raspberry Pi from a remote location and see what was happening. To be honest, VNC cloud works really well for this. But as this was for fun and for exploring AWS, I decided to take some advice and give the combination of Session Manager in Systems Manager plus Fleet Manager (a new toy) a go, so I installed the SSM agent on my Pi and activated it. This allowed me to access the terminal on the device from AWS and also patch it remotely using Documents and Run Command in Systems Manager.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  Conclusion
&lt;/h1&gt;

&lt;p&gt;This was a great "pet" project that allowed my daughter to see her fish from anywhere. It also allowed me to explore lots of the capabilities of AWS and to link them together. I am now looking into how to improve the Real Time Water Quality feature to better predict when the filter needs changing and the water cleaning. I also want it to alert me if Goldie and Star are getting too hot (above 26°C) or too cold (below 20°C). &lt;/p&gt;
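&lt;p&gt;That planned alert could start as a simple threshold check. The 20-26°C comfort range comes from the paragraph above; everything else here is a sketch:&lt;/p&gt;

```python
# Sketch of the planned temperature alert: flag readings outside the
# 20-26 C comfort range. The function name and return values are my own.
def water_alert(temp_c, low_c=20.0, high_c=26.0):
    if low_c > temp_c:
        return "too cold"
    if temp_c > high_c:
        return "too hot"
    return "ok"

print(water_alert(23.0))  # a comfortable reading for Goldie and Star
```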

&lt;p&gt;More blogs will follow that deep dive into how some of the features were built. First, though, you need to install your OS and get access, &lt;a href="https://dev.to/philbasford/fish-cam-hocking-up-fish-to-the-cloud-5fag"&gt;so see my next blog here&lt;/a&gt; &lt;/p&gt;

&lt;p&gt;Finally, you can see a demo here: &lt;a href="https://www.twitch.tv/videos/1135918428" rel="noopener noreferrer"&gt;https://www.twitch.tv/videos/1135918428&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>edge</category>
      <category>iot</category>
      <category>machinelearning</category>
    </item>
    <item>
      <title>AWS Advanced Networking</title>
      <dc:creator>philbasford</dc:creator>
      <pubDate>Sat, 17 Jul 2021 16:20:46 +0000</pubDate>
      <link>https://forem.com/aws-builders/aws-advanced-networking-4h47</link>
      <guid>https://forem.com/aws-builders/aws-advanced-networking-4h47</guid>
<description>&lt;p&gt;I recently completed the AWS Advanced Networking Specialty certification, which is focused on core networking principles and their implementation on AWS. Here are my notes, and also my mega tip for all AWS exams.&lt;/p&gt;

&lt;h2&gt;
  
  
  Preparation
&lt;/h2&gt;

&lt;p&gt;For my preparation and training I used and did the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Took the ACloudGuru course&lt;/li&gt;
&lt;li&gt;Did the AWS exam readiness training&lt;/li&gt;
&lt;li&gt;I have 6 years of AWS experience and have designed and implemented lots of VPCs etc&lt;/li&gt;
&lt;li&gt;Completed my SA Pro (I would say that if you do this, you’re at least 60% of the way to this specialism and the security one)&lt;/li&gt;
&lt;li&gt;Completed my DevOps Pro, which helped a lot on automation &lt;/li&gt;
&lt;li&gt;I had the complete luxury, about 5 years ago, of actually helping to set up Direct Connect&lt;/li&gt;
&lt;li&gt;The APN ambassador programme has run some additional training and sessions over the last year.&lt;/li&gt;
&lt;li&gt;Reading the Direct Connect and VPN docs in detail&lt;/li&gt;
&lt;li&gt;10 years ago I built a network using VLANs, VPNs etc in our office at the time&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;“There is no compression algorithm for experience” but training helps.&lt;/p&gt;

&lt;h2&gt;
  
  
  Main Subjects
&lt;/h2&gt;

&lt;p&gt;You’re going to have to know the following for this certification:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;VPCs And Subnets&lt;/li&gt;
&lt;li&gt;Security Groups and NACLs&lt;/li&gt;
&lt;li&gt;Route 53&lt;/li&gt;
&lt;li&gt;Routing, BGP and how AWS does it&lt;/li&gt;
&lt;li&gt;Direct Connect&lt;/li&gt;
&lt;li&gt;VPN&lt;/li&gt;
&lt;li&gt;CloudFormation (infrastructure elements)&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Odd Ball Questions
&lt;/h2&gt;

&lt;p&gt;Now I always prepare for getting questions I don’t know, or even getting some wrong; we are all human, and you need to make sure you complete the exam. So here is how I deal with them. If I draw a complete blank (the ones I just don’t know), then I take a guess; don’t waste time or worry about it. For any hard or unsure ones that I know will take time, I have a quick stab, “Flag for review”, and move on. If I have any time left later I will have a better crack at them.&lt;/p&gt;

&lt;p&gt;Also be advised that some questions could be trial questions (you get about 5 per exam that are new questions you’re not marked on). You won’t know whether they are trial ones or not. So I am not saying they were, but I did get some questions a little outside of the normal. They were:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Transit Gateway: the common wisdom is that this is too new for the certification. It is not, as it is over 6 months old.&lt;/li&gt;
&lt;li&gt;Global Accelerator: something I had not covered but knew just enough &lt;/li&gt;
&lt;li&gt;IPAM and how to integrate a 3rd party one or build your own&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Mega Tip
&lt;/h2&gt;

&lt;p&gt;In the exam centres where you take your exam you can ask for and use a whiteboard and a pen for workings and notes. For online proctoring, if you use Pearson VUE online, there is an online whiteboard feature; it is a little basic, like MS Paint, but good enough. The whiteboard option is located on the top bar, next to the “chat” with the moderator. All handy. &lt;/p&gt;

&lt;p&gt;Given that I can’t do subnet maths in my head, and I got hit with two or three such questions in the first few minutes, I decided to crack open the online whiteboard and type out the following table:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;

&lt;tr&gt;&lt;td colspan="3"&gt;YYYY.YYYY.YYYY.XXXX&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt; 32 &lt;/td&gt;
&lt;td&gt; 1 &lt;/td&gt;
&lt;td rowspan="2"&gt; Remember the first 4 addresses plus the broadcast address cannot be used&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt; 31 &lt;/td&gt;
&lt;td&gt; 2 &lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt; 30 &lt;/td&gt;
&lt;td&gt; 4 &lt;/td&gt; &lt;td rowspan="2"&gt; Also don’t forget VPC and subnet CIDRs can be no smaller than /28. However for security groups, routes and NACLs you can use smaller ones for non-VPC addresses &lt;/td&gt; &lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt; 29 &lt;/td&gt;
&lt;td&gt; 8 &lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt; 28 &lt;/td&gt;
&lt;td&gt; 16 &lt;/td&gt; &lt;td&gt;&lt;/td&gt; &lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt; 27 &lt;/td&gt;
&lt;td&gt; 32 &lt;/td&gt; &lt;td&gt;&lt;/td&gt; &lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt; 26 &lt;/td&gt;
&lt;td&gt; 64 &lt;/td&gt; &lt;td&gt;&lt;/td&gt; &lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt; 25 &lt;/td&gt;
&lt;td&gt; 128&lt;/td&gt; &lt;td&gt;&lt;/td&gt; &lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt; 24 &lt;/td&gt;
&lt;td&gt; 256&lt;/td&gt; &lt;td&gt;&lt;/td&gt; &lt;/tr&gt;
&lt;tr&gt;&lt;td colspan="3"&gt;
YYYY.YYYY.XXXX.ZZZZ&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt; 23 &lt;/td&gt;
&lt;td&gt; 512 &lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt; 22 &lt;/td&gt;
&lt;td&gt; 1024 &lt;/td&gt; &lt;td&gt;&lt;/td&gt; &lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt; 21 &lt;/td&gt;
&lt;td&gt; 2048 &lt;/td&gt; &lt;td&gt;&lt;/td&gt; &lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt; 20 &lt;/td&gt;
&lt;td&gt; 4096 &lt;/td&gt; &lt;td&gt;&lt;/td&gt; &lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt; 19 &lt;/td&gt;
&lt;td&gt; 8192 &lt;/td&gt; &lt;td&gt;&lt;/td&gt; &lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt; 18 &lt;/td&gt;
&lt;td&gt; 16384 &lt;/td&gt; &lt;td&gt;&lt;/td&gt; &lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt; 17 &lt;/td&gt;
&lt;td&gt; 32768 &lt;/td&gt; &lt;td&gt;&lt;/td&gt; &lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt; 16 &lt;/td&gt;
&lt;td&gt; 65536 &lt;/td&gt;
&lt;td&gt;Remember /16 is the largest you can have for a VPC or subnet. However you can use ones up to /8 in security groups, NACLs or routes for non-VPC addresses&lt;/td&gt;
&lt;/tr&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;For the rest of the exam I was able to answer questions requiring subnet maths quickly, and my head hurt a lot less. You can use the same tip&lt;br&gt;
with a confusion matrix for the ML specialism.&lt;/p&gt;
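&lt;p&gt;The table can also be reproduced programmatically. A quick Python sketch, where the 5 reserved addresses are the first 4 plus broadcast, as noted in the table:&lt;/p&gt;

```python
# Address count for a given prefix length, and the usable hosts in an
# AWS subnet (AWS reserves the first 4 addresses plus broadcast, 5 total).
def addresses(prefix):
    return 2 ** (32 - prefix)

def aws_usable(prefix):
    return addresses(prefix) - 5

# e.g. a /24 has 256 addresses; the smallest AWS subnet, /28, has 11 usable
print(addresses(24), aws_usable(28))
```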

&lt;p&gt;Enjoy, and if you are doing this certification, good luck!&lt;/p&gt;

</description>
    </item>
    <item>
      <title>re:invent 2020 Sessions</title>
      <dc:creator>philbasford</dc:creator>
      <pubDate>Sat, 21 Nov 2020 15:05:06 +0000</pubDate>
      <link>https://forem.com/philbasford/re-invent-2020-sessions-296k</link>
      <guid>https://forem.com/philbasford/re-invent-2020-sessions-296k</guid>
<description>&lt;p&gt;The following are some initial re:invent 2020 sessions on Machine Learning and Data Analytics that I recommend. I am aiming to view some of these and give a wrap-up as part of my &lt;a href="http://www.comsum.co.uk"&gt;www.comsum.co.uk&lt;/a&gt; punditry. &lt;/p&gt;

&lt;h2&gt;
  
  
  Machine Learning:
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;How to use fully managed Jupyter notebooks in Amazon SageMaker&lt;/li&gt;
&lt;li&gt;Implementing MLOps practices with Amazon SageMaker&lt;/li&gt;
&lt;li&gt;Reinventing medical imaging with machine learning on AWS&lt;/li&gt;
&lt;li&gt;From POC to production: Strategies for achieving machine learning at scale&lt;/li&gt;
&lt;li&gt;Secure and compliant machine learning for regulated industries&lt;/li&gt;
&lt;li&gt;How to use machine learning and the Data Cloud for advanced analytics (sponsored by Snowflake)&lt;/li&gt;
&lt;li&gt;Train and tune ML models to the highest accuracy using Amazon SageMaker&lt;/li&gt;
&lt;li&gt;Fast distributed training and near-linear scaling with PyTorch on AWS&lt;/li&gt;
&lt;li&gt;Train large models with billions of parameters in TensorFlow 2.0&lt;/li&gt;
&lt;li&gt;Choose the right machine learning algorithm in Amazon SageMaker&lt;/li&gt;
&lt;li&gt;Interpretability and explainability in machine learning&lt;/li&gt;
&lt;li&gt;Privacy-preserving machine learning&lt;/li&gt;
&lt;li&gt;Build quality ML models easily &amp;amp; quickly with Amazon SageMaker Autopilot&lt;/li&gt;
&lt;li&gt;Scaling MLOps on Kubernetes with Amazon SageMaker Operators&lt;/li&gt;
&lt;li&gt;Machine learning inference with Amazon EC2 Inf1 instances&lt;/li&gt;
&lt;li&gt;Architectural best practices for machine learning applications&lt;/li&gt;
&lt;li&gt;Accelerate machine learning projects with pretrained models&lt;/li&gt;
&lt;li&gt;Detect machine learning (ML) model drift in production&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Data Analytics
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;How BMW Group uses AWS serverless analytics for a data-driven ecosystem&lt;/li&gt;
&lt;li&gt;How FINRA operates PB-scale analytics on data lakes with Amazon Athena&lt;/li&gt;
&lt;li&gt;Break down data silos: Build a serverless data lake on Amazon S3&lt;/li&gt;
&lt;li&gt;Serverless data preparation with AWS Glue&lt;/li&gt;
&lt;li&gt;Run big data analytics faster at lower cost with Amazon EMR&lt;/li&gt;
&lt;li&gt;Improving analytics productivity for overwhelmed data teams (sponsored by Matillion)&lt;/li&gt;
&lt;li&gt;5 best practices for migrating large analytics systems to AWS (sponsored by Teradata)&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Serverless and Containers
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;AWS Fargate: Are serverless containers right for you?&lt;/li&gt;
&lt;li&gt;Getting started building your first serverless web application&lt;/li&gt;
&lt;li&gt;Serverless everything: Replatforming for speed and ownership (sponsored by Datadog)&lt;/li&gt;
&lt;li&gt;Scalable serverless event-driven architectures with SNS, SQS &amp;amp; Lambda&lt;/li&gt;
&lt;li&gt;AXA: Rearchitecting with serverless to accelerate innovation&lt;/li&gt;
&lt;li&gt;Power modern serverless applications with GraphQL and AWS AppSync&lt;/li&gt;
&lt;li&gt;Best practices for securing your serverless applications&lt;/li&gt;
&lt;li&gt;CI/CD for serverless applications&lt;/li&gt;
&lt;li&gt;Becoming proficient with serverless application observability&lt;/li&gt;
&lt;li&gt;Decoupling serverless workloads with Amazon EventBridge&lt;/li&gt;
&lt;li&gt;Best practices for security governance in serverless applications&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>aws</category>
    </item>
    <item>
      <title>Using CodeArtifact from SAM</title>
      <dc:creator>philbasford</dc:creator>
      <pubDate>Tue, 10 Nov 2020 22:39:18 +0000</pubDate>
      <link>https://forem.com/aws-builders/using-codeartifact-from-sam-gje</link>
      <guid>https://forem.com/aws-builders/using-codeartifact-from-sam-gje</guid>
<description>&lt;p&gt;Following my blog (&lt;a href="https://www.inawisdom.com/amazon/codeartifact-storing-your-dependencies/"&gt;https://www.inawisdom.com/amazon/codeartifact-storing-your-dependencies/&lt;/a&gt;) I have worked out how you can use private dependencies in CodeArtifact with SAM. The following is a quick guide.&lt;/p&gt;

&lt;p&gt;Firstly, use get-authorization-token to fetch a secure token for CodeArtifact using the AWS CLI, and store it in an env var called CODEARTIFACT_AUTH_TOKEN:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;export CODEARTIFACT_AUTH_TOKEN=$(aws codeartifact get-authorization-token --domain my-domain --domain-owner 112333322 --query authorizationToken --output text --profile my-profile --region eu-west-1 )
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
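&lt;p&gt;If you prefer to stay in Python, here is a sketch of fetching the token with boto3 and assembling the index URL that the Pipfile below builds from the env var. The domain, owner, region and repository names are the placeholder values from this guide:&lt;/p&gt;

```python
# Sketch: build the CodeArtifact pip index URL. Domain, owner, region
# and repository below are placeholders, not real values.
def index_url(token, domain="my-domain", owner="112333322",
              region="eu-west-1", repo="PrivatePyPi"):
    host = "{0}-{1}.d.codeartifact.{2}.amazonaws.com".format(domain, owner, region)
    return "https://aws:{0}@{1}/pypi/{2}/simple/".format(token, host, repo)

# The token fetch needs AWS credentials, so it is left commented out:
# import boto3
# token = boto3.client("codeartifact").get_authorization_token(
#     domain="my-domain", domainOwner="112333322")["authorizationToken"]
print(index_url("example-token"))
```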



&lt;p&gt;Then create a Pipfile with the CodeArtifact URL but note the use of the CODEARTIFACT_AUTH_TOKEN env var:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[[source]]
name = "pypi"
url = "https://aws:$CODEARTIFACT_AUTH_TOKEN@my-domain-12333222.d.codeartifact.eu-west-1.amazonaws.com/pypi/PrivatePyPi/simple/"
verify_ssl = true

[dev-packages]
pylint = "*"
#awscli = "==1.16.292"
#aws-sam-cli = "==0.40.0"
flake8 = "*"
flake8-print = "==3.1.4"
flake8-logging-format = "==0.6.0"
flake8-builtins = "==1.4.2"
flake8-eradicate = "==0.3.0"
flake8-comprehensions = "==3.2.2"
flake8-breakpoint = "==1.1.0"
flake8-docstrings = "*"
flake8-rst-docstrings = "*"
flake8-blind-except = "*"
pep8-naming = "==0.9.1"
cfn-lint = "==0.27.4"
pytest = "*"
pytest-cov = "==2.8.1"
moto = "*"
bandit = "*"
safety = "*"
twine = "*"

[packages]
pcb_common = "*"

[requires]
python_version = "3.8"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then use pipenv to create a requirements file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pipenv lock -r &amp;gt; src/requirements.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then use the requirements.txt to pull the dependencies:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sam build  CommonLayer --template template.yaml --use-container --base-dir .
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The result is the ability to pull your private library. Simple as that! &lt;/p&gt;

</description>
      <category>aws</category>
      <category>sam</category>
      <category>codeartifact</category>
      <category>serverless</category>
    </item>
  </channel>
</rss>
