<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Daniel Wellington technology</title>
    <description>The latest articles on Forem by Daniel Wellington technology (@dwtech).</description>
    <link>https://forem.com/dwtech</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Forganization%2Fprofile_image%2F3546%2F3b8640ba-93a8-4c9c-bc56-898d7d18ba6a.jpeg</url>
      <title>Forem: Daniel Wellington technology</title>
      <link>https://forem.com/dwtech</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/dwtech"/>
    <language>en</language>
    <item>
      <title>DWCFLint - The tool supporting our cost control &amp; improving our security</title>
      <dc:creator>Lezgin Zilan</dc:creator>
      <pubDate>Tue, 19 Jan 2021 08:56:24 +0000</pubDate>
      <link>https://forem.com/dwtech/dwcflint-how-we-solved-our-cloudformation-frustration-and-gained-cost-control-security-improvement-30fb</link>
      <guid>https://forem.com/dwtech/dwcflint-how-we-solved-our-cloudformation-frustration-and-gained-cost-control-security-improvement-30fb</guid>
      <description>&lt;p&gt;At Daniel Wellington, one of our core principles since 2016 has been serverless first, and almost everything we have developed over the last four years is based upon AWS’ serverless products.&lt;/p&gt;

&lt;p&gt;We are heavily built upon CloudFormation and have been for five years; every resource needs to be defined in a CloudFormation template before it can be created in any environment outside development. At the time of writing this article, we have 4,310 stacks across all our accounts &amp;amp; regions, of which 14.45% are located in AWS China. At this scale, we started to discover various corner cases and undocumented behavior; internally, the service earned the nickname “CloudFrustration”. We also identified a need to align our way of working and follow best practices.&lt;/p&gt;

&lt;p&gt;It started with the problem of strings with leading zeros in CloudFormation. Putting a string with a leading zero in a YAML-based CloudFormation file will lead to the aws-cli checking if the string contains only numeric characters. If it does, an implicit cast of the string to a numerical data type is made, which automatically strips away any leading zeros. We noticed this behavior the hard way, as an ID with leading zeros embedded in a CloudFormation file led to very confusing errors in our test environment. After this discovery, we decided it was time to come up with a way of detecting these kinds of errors early and in an automated fashion.&lt;/p&gt;
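
&lt;p&gt;A minimal sketch of that failure mode (the value and function name are illustrative, not part of DWCFLint):&lt;/p&gt;

```python
import re

def has_risky_leading_zero(value):
    """True for numeric-looking strings with a leading zero, which an
    implicit YAML/aws-cli cast to a number would silently strip."""
    return bool(re.fullmatch(r"0\d+", value))

# The implicit cast that bit us: the ID "007" becomes the number 7,
# so such values must be detected and kept explicitly quoted.
print(int("007"))
print(has_risky_leading_zero("007"))
print(has_risky_leading_zero("1007"))
```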

&lt;p&gt;We started to build our own CloudFormation YAML parser library, but then found the awesome &lt;a href="https://github.com/aws-cloudformation/cfn-python-lint" rel="noopener noreferrer"&gt;cfn-lint&lt;/a&gt; project, with a great set of built-in linting rules and a plugin API.&lt;/p&gt;

&lt;p&gt;The plugin API works by implementing custom rules as Python classes; when running the cfn-lint binary, you supply either a file path to a folder containing the class files or an import path to a module containing the rules. For ease of packaging, installing and distributing our custom rules, we decided to create a wrapper around cfn-lint containing the rules we created.&lt;/p&gt;
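
&lt;p&gt;A sketch of what such a rule class looks like, using the log-retention check as an example. The base class and match type normally come from cfnlint; they are stubbed here so the sketch is self-contained, and the rule id and simplified entry point are illustrative, not DWCFLint’s actual code:&lt;/p&gt;

```python
# Stand-ins for cfnlint.rules.CloudFormationLintRule / RuleMatch, stubbed so
# the example runs without cfn-lint installed.
class CloudFormationLintRule:
    """Stand-in base class for custom lint rules."""

class RuleMatch:
    """Stand-in for a lint finding at a template path."""
    def __init__(self, path, message):
        self.path, self.message = path, message

class LogGroupRetentionRule(CloudFormationLintRule):
    id = "W9001"  # custom rules conventionally use the 9xxx range
    shortdesc = "Explicit log retention"
    description = "Warn when a log group has no retention period set"

    def match_template(self, resources):
        """Simplified entry point: takes the parsed Resources dict and
        returns a RuleMatch for every log group without RetentionInDays."""
        matches = []
        for name, resource in resources.items():
            if resource.get("Type") == "AWS::Logs::LogGroup":
                props = resource.get("Properties", {})
                if "RetentionInDays" not in props:
                    matches.append(RuleMatch(
                        ["Resources", name],
                        "Log group is missing an explicit retention period"))
        return matches
```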

&lt;p&gt;Over time, more rules were developed in response to newly discovered corner cases, common mistakes, cost-saving opportunities and the need to align teams. Below is a timeline highlighting some rules that have supported us well.&lt;/p&gt;

&lt;p&gt;During 2018, we had a lot going on, and after two years of increasing Lambda usage we noticed that our CloudWatch logs were consuming massive amounts of storage space, since the default retention period is infinite, even though we rarely needed to look at logs older than a month or a quarter. This led to the creation of a rule warning if the retention period is not explicitly set on log groups.&lt;/p&gt;

&lt;p&gt;At one point, we hit the maximum quota of API Gateways of the endpoint type “edge”. Few of the API Gateways allocated by that time actually needed edge capacity (the same infrastructure as CloudFront); it was a legacy of starting with serverless in 2016, before regional support for API Gateway was released at the end of 2017. We could also see that the migration was slow because outdated CloudFormation templates from other projects were reused whenever we built something new. To help with the migration and reduce costs, as regional endpoints are cheaper, we created a rule warning if the endpoint configuration was not explicitly defined.&lt;/p&gt;

&lt;p&gt;In 2019, during another period of cost-efficiency improvements, it was decided that all DynamoDB tables should use pay-per-request billing, so we made a rule warning if provisioned throughput was detected in configuration files. At the end of 2019, AWS dropped support for the Node.js 8.x Lambda runtime. To help with our migration effort, we made a rule that reports an error if any deprecated Lambda runtime is detected in the CloudFormation files of a project.&lt;/p&gt;
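
&lt;p&gt;The deprecated-runtime check boils down to a scan like this (the runtime list and function name are illustrative, not DWCFLint’s actual rule):&lt;/p&gt;

```python
# Runtimes that should fail the build; nodejs8.10 is the identifier AWS
# dropped support for at the end of 2019.
DEPRECATED_RUNTIMES = {"nodejs4.3", "nodejs6.10", "nodejs8.10"}

def find_deprecated_runtimes(resources):
    """Yield (resource_name, runtime) for every Lambda function in the
    parsed Resources dict that uses a deprecated runtime."""
    for name, resource in resources.items():
        if resource.get("Type") == "AWS::Lambda::Function":
            runtime = resource.get("Properties", {}).get("Runtime")
            if runtime in DEPRECATED_RUNTIMES:
                yield name, runtime
```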

&lt;p&gt;In July 2020, during a period of security improvement work, we added a rule warning if usage of AWS managed “FullAccess” policies, or misuse of “*” for Action or Resource, was detected in IAM policies.&lt;/p&gt;
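
&lt;p&gt;The wildcard part of that check can be sketched in a few lines (illustrative only; the real rule also flags AWS managed FullAccess policy ARNs):&lt;/p&gt;

```python
def risky_statement(statement):
    """True if an IAM policy statement uses a wildcard Action or Resource."""
    def as_list(value):
        # IAM allows both a single string and a list of strings here.
        return value if isinstance(value, list) else [value]
    actions = as_list(statement.get("Action", []))
    resources = as_list(statement.get("Resource", []))
    return "*" in actions or "*" in resources
```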

&lt;p&gt;At the time of writing this article, we support the following rules:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;No mismatched log groups and subscription filters&lt;/li&gt;
&lt;li&gt;No missing endpoint types&lt;/li&gt;
&lt;li&gt;No missing log retention period&lt;/li&gt;
&lt;li&gt;No use of deprecated lambda runtime&lt;/li&gt;
&lt;li&gt;No use of full access policies&lt;/li&gt;
&lt;li&gt;No use of leading zeroes in numbers or strings&lt;/li&gt;
&lt;li&gt;No missing/implicit log groups for lambdas&lt;/li&gt;
&lt;li&gt;No use of old style subscription filters&lt;/li&gt;
&lt;li&gt;No use of provisioned throughput&lt;/li&gt;
&lt;li&gt;No use of reserved environment variable names&lt;/li&gt;
&lt;li&gt;No use of reserved words for DynamoDB column names&lt;/li&gt;
&lt;li&gt;No malformed subscription filters&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Now we want to open source this (yet another) linting tool, which can be found &lt;a href="https://github.com/dwtechnologies/dwcflint" rel="noopener noreferrer"&gt;here&lt;/a&gt;. We believe it is stable and mature, although we expect to find new cases that can be turned into rules for automatic detection and mitigation in the future. If this tool sounds useful to you, please check out the repository and feel free to report issues or submit pull requests.&lt;/p&gt;

&lt;p&gt;Do you want to know more about what it is like to work with technology at Daniel Wellington? Take a few minutes to watch our 3-minute video story at the top right, and if you are open to new challenges, check out our open &lt;a href="https://careerseurope.danielwellington.com/jobs?department_id=21328" rel="noopener noreferrer"&gt;tech positions&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>cloudformation</category>
      <category>aws</category>
      <category>lint</category>
      <category>devsecops</category>
    </item>
    <item>
      <title>Boosting the assembly process with AWS serverless &amp; Axis camera</title>
      <dc:creator>Lezgin Zilan</dc:creator>
      <pubDate>Thu, 14 Jan 2021 12:52:14 +0000</pubDate>
      <link>https://forem.com/dwtech/boosting-the-assembly-process-with-aws-serverless-axis-camera-3oin</link>
      <guid>https://forem.com/dwtech/boosting-the-assembly-process-with-aws-serverless-axis-camera-3oin</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy7jegsp6eiogzvnrqimq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy7jegsp6eiogzvnrqimq.png" alt="The solution" width="522" height="509"&gt;&lt;/a&gt;&lt;br&gt;
Daniel Wellington is a Swedish fashion brand founded in 2011. Since its inception, it has sold over 11 million watches and established itself as one of the fastest-growing and most beloved brands in the industry.&lt;/p&gt;

&lt;p&gt;In 2012 we launched our Instagram account, and we became pioneers within influencer marketing. Influencers became our primary marketing channel.&lt;/p&gt;

&lt;p&gt;To scale with our challenges and support the developing business, we started using Amazon Web Services (AWS) in 2014. Today we are heavily invested in the AWS platform, with a broad range of products such as Amazon Elastic Container Service, AWS Lambda, Amazon DynamoDB, Amazon SageMaker, Amazon Rekognition, and many more.&lt;/p&gt;

&lt;p&gt;Our design principle has been serverless first since 2016. If serverless is not available or practical, we use containers, as we consider EC2 legacy. JavaScript and Go are some of our languages of choice.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9qbl94bb8oins745og3a.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9qbl94bb8oins745og3a.gif" alt="The mobile app to optimize our return process that we built with AWS Rekognition service." width="480" height="270"&gt;&lt;/a&gt;&lt;em&gt;The mobile app to optimize our return process that we built with AWS Rekognition service.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Focus on growth and perfection also leads to high standards and requirements for the factories producing our products. To make our pre-quality check work as smoothly as possible to minimize any potential human mistakes, we built an internal mobile app to &lt;a href="https://medium.com/daniel-wellington-tech-stories/how-we-made-the-return-process-more-efficient-thanks-to-an-ipa-beer-notification-hack-5de1a612ed16" rel="noopener noreferrer"&gt;optimize the return process&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;After extensive testing, we realized that reusing the same solution for this new challenge would introduce other problems. The simple optics of a mobile camera pointed at a shiny, polished watch would neither give us the level of perfection we needed nor enable other use cases.&lt;/p&gt;

&lt;p&gt;We needed efficiency at scale without making this a burden for our assembly line.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmdk0owxnrggp6a633v9h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmdk0owxnrggp6a633v9h.png" alt="Early tests with the Canon EOS 5D Mark IV &amp;amp; Raspberry pi high quality camera connected to a raspberry pi running a bash script (yes, bash!) wrapping aws-cli in the glorious innovation lab at DW" width="800" height="651"&gt;&lt;/a&gt;&lt;em&gt;Early tests with the Canon EOS 5D Mark IV &amp;amp; Raspberry pi high quality camera connected to a raspberry pi running a bash script (yes, bash!) wrapping aws-cli in the glorious innovation lab at DW&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;In our initial tests we used a DSLR (Canon EOS 5D Mark IV), a Raspberry Pi HQ camera &amp;amp; a USB camera. We realized that a DSLR camera would be the better fit thanks to its optics and fine-grained adjustments, with pre-processing done on a Raspberry Pi.&lt;/p&gt;

&lt;p&gt;We also searched the market for DSLR cameras running Android to deploy our app on, but they disappeared as fast as they entered the market. This made us realize that we would need to maintain several layers ourselves, such as hardware, OS, and security (patching, certificates &amp;amp; disk encryption). AWS Greengrass would solve parts of the problem, but the ownership was still too much. For us, that meant a step backward, a lot of moving parts, an increase in TCO (total cost of ownership), and less focus on what is most important: the code.&lt;/p&gt;

&lt;p&gt;We started to look for a solution where we could remove such ownership and, at the same time, speed up the transfer time on the edge by doing the pre-processing of the image before it securely touched our code.&lt;/p&gt;

&lt;p&gt;We quickly identified &lt;a href="https://www.axis.com/" rel="noopener noreferrer"&gt;Axis Communications&lt;/a&gt; cameras as fulfilling our pre-processing needs on the edge, and we started working together to integrate them with AWS.&lt;/p&gt;

&lt;p&gt;We had learned from our &lt;a href="https://medium.com/daniel-wellington-tech-stories/how-we-made-the-return-process-more-efficient-thanks-to-an-ipa-beer-notification-hack-5de1a612ed16" rel="noopener noreferrer"&gt;earlier OCR&lt;/a&gt; tests that the AWS Rekognition service would be the most cost-efficient and powerful setup to use, and that it would enable us to do automated quality checks and process optimization at scale.&lt;/p&gt;

&lt;p&gt;We do this by capturing only a specific part of the picture (we crop an area), then transforming it to black and white and optimizing it. A button connected to the I/O port of the camera triggers the event that sends the image to AWS over secure MQTT.&lt;/p&gt;

&lt;p&gt;The benefit of secure MQTT (besides being secured with a client certificate) is that it is built for IoT (low bandwidth and low compute power) with long-lived, reusable sessions. You would be amazed how rarely it is implemented in off-the-shelf IoT products.&lt;/p&gt;

&lt;p&gt;By presenting our use case, we quickly got the Axis integration team on board; they used their plugin architecture (ACAP) to build secure MQTT support so we could use AWS IoT Core. Their application has now been &lt;a href="https://github.com/aintegration/acaps/tree/master/Publisher" rel="noopener noreferrer"&gt;released to the public on GitHub&lt;/a&gt;, with some beta testing from us.&lt;/p&gt;

&lt;p&gt;On the AWS side, you can see in the diagram below how we consume the data through secure MQTT, process it via AWS Rekognition, and store the result and the original picture on S3 for debugging purposes, to be deleted automatically with &lt;a href="https://aws.amazon.com/blogs/aws/amazon-s3-object-expiration/" rel="noopener noreferrer"&gt;object expiration&lt;/a&gt;.&lt;/p&gt;
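
&lt;p&gt;In CloudFormation, that object expiration is a small lifecycle rule on the bucket; the resource name and retention length here are illustrative:&lt;/p&gt;

```yaml
# Illustrative fragment: auto-delete debug captures after 30 days.
DebugCapturesBucket:
  Type: AWS::S3::Bucket
  Properties:
    LifecycleConfiguration:
      Rules:
        - Id: ExpireDebugCaptures
          Status: Enabled
          ExpirationInDays: 30
```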

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2AZAUFQL_rYYsr8cWYsW2TGg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F2000%2F1%2AZAUFQL_rYYsr8cWYsW2TGg.png" alt="More in depth about the different parts can be found in the [GitHub](https://github.com/dwtechnologies/axis-aws-rekognition) repository." width="800" height="410"&gt;&lt;/a&gt;&lt;em&gt;More in depth about the different parts can be found in the &lt;a href="https://github.com/dwtechnologies/axis-aws-rekognition" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt; repository.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;For us, this is a big step towards bringing AWS into our assembly process, enabling other use cases with the same setup while gaining quick wins. To support the community and speed up the transformation at manufacturers, we will also release our POC (proof of concept) &lt;a href="https://github.com/dwtechnologies/axis-aws-rekognition" rel="noopener noreferrer"&gt;code on GitHub&lt;/a&gt;, which brings together the powerful OCR function from Rekognition, results delivered over MQTT, and powerful cameras. With some simple code changes, you could quickly get object recognition with thousands of supported objects/scenes, and custom labels to train it for your &lt;a href="https://aws.amazon.com/rekognition/custom-labels-features/#Key_features" rel="noopener noreferrer"&gt;own object detection with a few images&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;What about the operational cost? Excluding electricity for the camera, we only have an AWS cost of $0.00106371 per capture in the POC, without optimizing any code or using cost-saving plans.&lt;/p&gt;
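
&lt;p&gt;As a back-of-envelope check (the monthly volume is a hypothetical figure, not ours), even heavy use stays cheap at that per-capture cost:&lt;/p&gt;

```python
# Per-capture cost from the POC; the volume is an assumed example.
cost_per_capture = 0.00106371   # USD
captures_per_month = 10_000
monthly_cost = cost_per_capture * captures_per_month
print(round(monthly_cost, 2))   # roughly ten dollars a month
```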

&lt;p&gt;You may ask: why? The right platforms and components are key to enabling incremental innovation. You may think: to do what? Just to get your creativity started: use custom labels to detect tiny defects, or to verify that the clock hands are set to Swedish time when assembled. Yes, we are a Swedish brand, and all our customers across the world start the experience with Swedish daylight-saving time ;-)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4gtv0oavfeq1jf8con8r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4gtv0oavfeq1jf8con8r.png" alt="This is the POC mounted inside a light tent always to get the same reflections and lighting regardless of whether it is in the lab, test, or the assembly process. A button is also mounted inside of the tent to trigger the event to capture and send the data over MQTT." width="726" height="858"&gt;&lt;/a&gt;&lt;em&gt;This is the POC mounted inside a light tent always to get the same reflections and lighting regardless of whether it is in the lab, test, or the assembly process. A button is also mounted inside of the tent to trigger the event to capture and send the data over MQTT.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;The code can be found &lt;a href="https://github.com/dwtechnologies/axis-aws-rekognition" rel="noopener noreferrer"&gt;here&lt;/a&gt;, and we would love to see your creative takes on this, even outside the manufacturing area ;-)&lt;/p&gt;

&lt;p&gt;Do you want to know more about what it is like to work with technology at Daniel Wellington? Take a few minutes to watch the video, and if you are open to new challenges, check out our open &lt;a href="https://careerseurope.danielwellington.com/jobs?department_id=21328" rel="noopener noreferrer"&gt;tech positions&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>serverless</category>
      <category>aws</category>
      <category>axis</category>
      <category>go</category>
    </item>
    <item>
      <title>How we made the return process more efficient thanks to an IPA beer notification hack</title>
      <dc:creator>Lezgin Zilan</dc:creator>
      <pubDate>Mon, 04 Jan 2021 11:10:24 +0000</pubDate>
      <link>https://forem.com/dwtech/how-we-made-the-return-process-more-efficient-thanks-to-an-ipa-beer-notification-hack-1a8k</link>
      <guid>https://forem.com/dwtech/how-we-made-the-return-process-more-efficient-thanks-to-an-ipa-beer-notification-hack-1a8k</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Feixbrj50ltrcaympff10.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Feixbrj50ltrcaympff10.gif" alt="The built return app using aws rekognition" width="480" height="270"&gt;&lt;/a&gt;&lt;br&gt;
Daniel Wellington is a Swedish fashion brand founded in 2011. Since its inception, it has sold over 11 million watches and established itself as one of the fastest-growing and most beloved brands in the watch industry’s history.&lt;/p&gt;

&lt;p&gt;Most people know we were pioneers of influencer marketing. That would be impossible to achieve without advanced technology. To make it possible, we have been using Amazon Web Services (AWS) since 2014. We use services like Amazon Elastic Container Service, AWS Lambda, Amazon DynamoDB, Amazon SageMaker and Amazon Rekognition.&lt;/p&gt;

&lt;p&gt;We develop our services and applications on the AWS cloud platform and make extensive use of all its amazing serverless features, neatly matched with microservice based architectures.&lt;/p&gt;

&lt;p&gt;Our design principle is serverless first. If serverless is not available or practical, containers are recommended; EC2 is legacy. JavaScript and Go are some of our languages of choice.&lt;/p&gt;

&lt;p&gt;As a global fashion company that has used technology to disrupt an entire industry, we are continually learning and challenging our way of doing business. Our latest venture is to understand how machine learning can help us develop business value.&lt;/p&gt;

&lt;p&gt;About a year ago, we started by looking into how we could bring &lt;a href="https://medium.com/daniel-wellington-tech-stories/how-we-brought-machine-learning-awareness-to-the-business-e9fef0f0f788" rel="noopener noreferrer"&gt;machine learning awareness to the business&lt;/a&gt;; since then, we have started an internal machine learning project where we are building a model to predict and detect certain events. Besides that, one of our employees, Anders, has also been playing around with the Rekognition service in his spare time, more specifically the text-in-image feature.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F84jlpp1z5ins6pge5ljl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F84jlpp1z5ins6pge5ljl.png" alt="The actual beer menu that is automatically captured and published several times per hour by the hipster bar." width="784" height="978"&gt;&lt;/a&gt;&lt;em&gt;The actual beer menu that is automatically captured and published several times per hour by the hipster bar.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Why?&lt;/p&gt;

&lt;p&gt;Well, what Anders tried to do was get a notification whenever a hipster bar on the south side of Stockholm added a new IPA to the beer menu.&lt;/p&gt;

&lt;p&gt;Why over-engineer the problem?&lt;/p&gt;

&lt;p&gt;Now, the problem is that they update the menu so often that they write it by hand and publish a picture of it on their website, instead of publishing the menu as text.&lt;/p&gt;

&lt;p&gt;With Anders’ marvellous personal experience using the &lt;a href="https://aws.amazon.com/rekognition/" rel="noopener noreferrer"&gt;Rekognition service&lt;/a&gt;, he wondered if we could employ it to solve an actual business problem. What we came up with was to use it to scan and read the back plate of our watches, to help the warehouse workers who take care of returns.&lt;/p&gt;

&lt;p&gt;During the peak season, each of them may have to handle several hundred returns per day. That is a very tedious process, and we understood that it took quite a while to look up the details in different systems and relabel the return. The relabelling part alone took a minimum of 40 and up to 60 seconds per item. At the same time, we realized that the business had been looking for a third-party solution for almost 18 months.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F6c3s6yreuemo4k7jzi2k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F6c3s6yreuemo4k7jzi2k.png" width="395" height="439"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;So we decided to solve the problem with ML technology. The outcome is a React Native app that communicates with an Amazon API Gateway and several Lambda functions, making use of SQS, Rekognition, and finally a serverless printer server (yes, you read it right).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fa4ha57aicr8ti56pkl89.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fa4ha57aicr8ti56pkl89.png" alt="The architecture diagram" width="800" height="484"&gt;&lt;/a&gt;&lt;em&gt;The architecture diagram&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Now, you may think that OCR is not a hard thing to do, and wonder why we are using a managed service when there are plenty of open source projects available. Well, we compared several frameworks with the AWS service and found that &lt;a href="https://aws.amazon.com/rekognition/" rel="noopener noreferrer"&gt;Rekognition&lt;/a&gt; had the highest precision, while the other frameworks were just not good enough. Besides the best accuracy, we also found the cost of running Rekognition to be low.&lt;/p&gt;

&lt;p&gt;Our warehouse workers can save up to 56 seconds per item, at a price of about $0.0013 per scan, Lambda included.&lt;/p&gt;

&lt;p&gt;As a result, we brought the relabelling process down to four seconds, and sometimes even faster, with higher quality and accuracy. The best part is that it took us less than two weeks to build and deploy it as an MVP. Coffee consumption in the warehouse even dropped, because the workers no longer need to spend time on a tedious, repetitive task.&lt;/p&gt;
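
&lt;p&gt;A quick sanity check of what that saving means per worker on a busy day (300 returns is an assumed volume; the 60-second and 4-second figures are from above):&lt;/p&gt;

```python
# Rough time saved per worker on a peak day; the daily volume is an
# assumption, the before/after timings come from our measurements.
seconds_saved_per_item = 60 - 4
returns_per_day = 300
hours_saved = seconds_saved_per_item * returns_per_day / 3600
print(round(hours_saved, 1))  # several hours per worker per day
```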

&lt;p&gt;&lt;a href="https://www.youtube.com/embed/NbUBYGvZLZ0" rel="noopener noreferrer"&gt;Here&lt;/a&gt; is a demo of it&lt;/p&gt;

&lt;p&gt;What’s next? We have now taken this project out of the proof of concept phase and we are building a proper implementation of the app with all the additional requirements needed in a production environment, such as authentication, the ability to choose the AWS region closest to the warehouse, running automated builds, etc.&lt;/p&gt;

&lt;p&gt;We will stay curious and keep experimenting until we set our eyes on the next problem to solve with the power of machine learning.&lt;/p&gt;

&lt;p&gt;Do you want to know more about what it is like to work with technology at Daniel Wellington? Take a few minutes to watch the &lt;a href="https://www.youtube.com/embed/Y0TqmXWMfHo" rel="noopener noreferrer"&gt;video&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>rekognition</category>
      <category>serverless</category>
      <category>lambda</category>
    </item>
  </channel>
</rss>
