When you look at AWS, you can see the origin of its public cloud strategy and why you can buy third-party technology solutions such as Palo Alto firewalls, Red Hat or SUSE Linux VMs loaded with applications, or even products from Cisco and other network vendors. As you know, marketplaces are nothing more than platforms that enable transactions between customers and third-party sellers.

Jim Collins coined the term “flywheel effect” and explained the concept to Jeff Bezos, who saw an incredible opportunity where other people would have seen just another methodology.

The idea is simple. Create a virtuous cycle that increases the number of sellers offering their products and services, which in turn increases the range of offers and price options, making it easier for users to find exactly what they want at the right price.

This improves traffic to the platform, which draws in more sellers and customers. Moreover, prices drop for users, and they get used to visiting your platform or marketplace regularly.

From Amazon to the AWS (Amazon Web Services) Marketplace

AWS Marketplace was the first cloud marketplace among the hyperscalers. AWS started the journey of selling third-party IT products and services, following in the footsteps of the Amazon retail platform.

Customers can buy thousands of ISV products and services to deploy with agility, whether just for testing or to find out whether a specific piece of software makes sense and fills a gap in their company.

There is flexibility in prices, offer terms, and conditions. There are annual pricing plans with 12-month subscriptions, or even one-month terms if you need, for example, to roll out a PoC. There are others, such as usage pricing, where customers pay only for what they use in a PAYG (pay-as-you-go) model, and pricing models for specific product delivery methods such as containers or ML.
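As a back-of-the-envelope illustration of how these models differ, here is a small sketch; all rates and fees are made-up numbers, not real AWS Marketplace prices:

```python
# Hypothetical comparison of a 12-month subscription vs. PAYG usage pricing.
# All prices are illustrative values, not real AWS Marketplace rates.

def payg_cost(hours_used: float, rate_per_hour: float) -> float:
    """Pay-as-you-go: you pay only for metered usage."""
    return hours_used * rate_per_hour

def subscription_cost(monthly_fee: float, months: int = 12) -> float:
    """Fixed-term subscription: a flat fee regardless of usage."""
    return monthly_fee * months

# A short PoC running ~100 hours/month for 2 months
poc = payg_cost(hours_used=200, rate_per_hour=1.50)   # 300.0
# Steady production use for a full year
prod = subscription_cost(monthly_fee=400.0)           # 4800.0

print(f"PoC on PAYG: ${poc:.2f}  |  Year subscription: ${prod:.2f}")
```

The point of the sketch: for a short PoC, usage pricing is the cheaper door in, while a committed subscription usually wins for steady year-round workloads.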

It is very flexible: you can even buy professional services products, which are in general packaged consulting offerings. All offers can be tailored to a target company if you are an AWS partner, and as a customer you can access public or even private offers to get better discounts or negotiate specific aspects with the ISV or consultancy you deal with.

Lots of solutions are waiting for you…

You can make plans for some offer types publicly available or available only to a specific (private) audience, just like the Azure Marketplace we explained in the previous post, which followed the same marketplace strategy AWS pioneered.

In summary, if you are figuring out what value an ISV can bring to your business in the cloud, or you want to leverage an AWS partner's professional services in a specific area such as cybersecurity or SAP, this is a chance you should not forget.

Enjoy the journey to the cloud with me…see you then in the next post.


When you, as a user, access your Azure portal or AWS portal, you have the option to buy thousands of products or solutions preconfigured for you. You don't need to worry about licences or the IT capability to design or deploy a specific solution, as they are all built around customer needs by several AWS or Microsoft partners and ISVs (independent software vendors).

We will talk about the AWS Marketplace later, in a new post. Just to point it out: it was launched in 2012 to accommodate and foster the growth of services from third-party providers that had built their own solutions on top of the Amazon Web Services platform, such as ISVs, SIs (system integrators), and resellers, so customers could buy exactly what they needed, when they needed it, adding tremendous flexibility to grow their cloud solutions in line with the business.

Azure Marketplace, in turn, was launched in 2014. It is a starting point for go-to-market IT software applications and services built by industry-leading technology companies. The commercial marketplace is available in 141 regions, on a per-plan basis.

What are the plans and how to use them (as a Partner)?

Microsoft partners can publish interesting solutions that bundle licenses and services together within the Azure Marketplace. On one hand, you don't need to acquire licenses separately, as their cost is prorated into the price. On the other hand, you get access to expertise without hiring new employees for your IT team.

A plan defines an offer's scope and limits, and the associated pricing when applicable. For example, depending on the offer type, you can select regional markets and choose whether a plan is visible to the public or only to a private audience. Some offer types support a scope of subscriptions, some support consumption-based pricing, and some let a customer purchase the offer with a license (BYOL, bring your own license) they have purchased directly from the publisher.

Offer type                  Plans with pricing options   Plans without pricing options   Private audience option
Azure managed application
Azure solution template
Azure container             ✔ (BYOL)
IoT Edge module
Managed service             ✔ (BYOL)
Software as a service
Azure virtual machine
  • Markets: Every plan must be available in at least one market. You have the option to select only “Tax Remitted” countries, in which Microsoft remits sales and use tax on your behalf.
  • Pricing: Pricing models only apply to plans for Azure managed application, SaaS, and Azure virtual machine offers. An offer can have only one pricing model. For example, a SaaS offer cannot have one plan that’s flat rate and another plan that’s per user.
  • Plan visibility: Depending on the offer type, you can define a private audience or hide the offer or plan from Azure Marketplace.

How to publish and what kind of visibility can we provide (As a Partner)?

You can make plans for some offer types publicly available or available to only a specific (private) audience. Offers with private plans will be published to the Azure portal.

You can configure a single offer type in different ways to enable different publishing options, listing options, provisioning, or pricing. The publishing option and configuration of the offer type also align with the offer's eligibility and technical requirements.

Be sure to review the online store and offer type eligibility requirements and the technical publishing requirements before creating your offer.

To publish your offers to Azure Marketplace, you need to have a commercial marketplace account in Partner Center and ensure your account is enrolled in the commercial marketplace program.

Also, if your offer is published with a private plan, you can update the audience or choose to make the plan available to everyone. After a plan is published as visible to everyone, it must remain visible to everyone and cannot be configured as a private plan again.

Finally, as a partner, you can enable a free trial on plans for transactable Azure virtual machine and SaaS offers.

For example, Azure virtual machine plans allow 1-, 3-, or 6-month free trials. When a customer selects a free trial, their billing information is collected, but billing doesn't start until the trial converts to a paid subscription.

What are your benefits when using Azure Marketplace (As a User)?

The marketplace brings flexibility to customers: they can immediately buy any kind of plan-based offer from thousands of ISVs, without losing time negotiating with each vendor or understanding every support model and licensing option in detail.

In the Azure portal, select + Create a resource or search for “Marketplace”. Then browse the categories on the left side or use the search bar, which includes a filter function, and choose what you need.

Likewise, there are lots of consultancy services provided by several Microsoft partners, some of them as a free trial, so you can test the quality of their professional services and see their approach to fixing your pain points.

Enjoy the journey to the cloud with me…see you then in the next post.


As we said in a previous post, the AWS Well-Architected Framework was officially launched in 2015. Microsoft Azure took more time, starting around 2020 with its own WAF methodology. Either way, it is a collection of best practices, guides, and blueprints in the same vein as its competitors': Google's (called the “4 key architecture principles/pillars”, but covering similar points) and AWS's, all based on experience and feedback from many stakeholders.

To summarize, the Azure and AWS WAFs and Google's 4 key architecture principles/pillars help cloud architects build secure, high-performing, resilient, and efficient infrastructure for their business applications and workloads. Moreover, they provide a better UX (user experience) for employees and users.

Azure Approach...

From the Microsoft point of view, there are also five clear pillars, just as for AWS:

  • Cost Optimization – Focuses on managing costs and reducing them as much as possible for the scenario at hand.
  • Operational Excellence – Focuses on the operations processes that keep a system running in production.
  • Performance Efficiency – Focuses on a workload's ability to scale and adapt to changes in demand in the cloud.
  • Reliability – Focuses on a system's ability to recover from failures and continue to function in the cloud.
  • Security – Protecting applications and data from threats, keeping in mind the shared responsibility model, where the customer, Microsoft, and sometimes partners work together on a given IT solution.

Did you notice any change compared to AWS below? Well, Microsoft highlights the same pillars but adds some extra material around them to make its offering more powerful: reference architectures, Azure Advisor as a starting point, the CCO Dashboard, Cloudockit, AzGovViz, specific partner offerings, and WAF Review reporting (this last one is no different from AWS).

The Azure approach to the Well-Architected Framework changes some of the steps compared to AWS. AWS stays at a higher level (HLD, high-level design) and drills down later, while Microsoft tries to gather more detail up front in order to sort out priorities, responsibilities, and tools, and to match the right technologies to the right issues sooner.

It may seem that this workshop process runs smoothly and is easy to use. The truth is, you will struggle with some workloads or specific IT components for sure. But what are the most important Microsoft Azure architecture “quality inhibitors” to face?

  • Cost Optimization – Underused or orphaned resources
  • Operational Excellence – No automation, or siloed automation
  • Performance Efficiency – No design for scaling
  • Reliability – No support for disaster recovery
  • Security – No security threat detection mechanism

As you can see, each hyperscaler has its own vision. But they are similar in the areas to evaluate and to fix when something is not working properly.

In the next post, we will cover in more depth the similarities and differences between the two big cloud titans, Azure and AWS. In the meantime, the ball is in your court. Read, read, and read… for sure you do… 🙂

Enjoy the journey to the cloud with me…see you then in the next post.


After some years migrating workloads from on-premises to the cloud, and some years developing cloud-first apps, architectures, technologies, and hyperscalers have been expanding their value and support for millions of businesses and companies… The Well-Architected Framework is nothing more than an approach to optimizing all those IT solutions from several perspectives.

AWS Approach...

In 2012, AWS created the “Well-Architected” initiative to share best practices for building in the cloud with its customers and partners, and started publishing them in 2015. Now this set of principles is a reality, extended to many cloud scenarios.

Let's say we have some reasonably complex workloads and IT solutions with cloud providers such as AWS or Microsoft Azure. On top of that, we are not sure the current scenario follows best practices in terms of reliability, as the IT service responds to users with small delays from time to time. Moreover, when you browse the AWS Cost Explorer console, this IT service shows high consumption.

What can we do? How can we shed some light on this? AWS provides a set of best practices, principles, and strategies to reduce risk and impact in the areas I've mentioned, as well as in others. Those areas, or rather pillars, are five:

  • Operational Excellence: The ability to support development and run workloads effectively, gain insight into their operations, and to continuously improve supporting processes and procedures to deliver business value.
  • Security: Focused on protecting data, systems, and assets, and on taking advantage of cloud technologies to improve your security.
  • Reliability: Enforces the ability of a workload to perform its intended function correctly and consistently when it’s expected to.
  • Performance Efficiency: The ability to use computing resources efficiently to meet system requirements, and to maintain that efficiency as demand changes and technologies evolve.
  • Cost Optimization: The ability to run systems to deliver business value at the lowest price point.

The AWS approach to the Well-Architected Framework provides great value for improving a specific workload, or several workloads with interdependencies. To leverage the potential of the five pillars, the Well-Architected Tool helps you review the current state of your workloads and compares them to the latest AWS architectural best practices in those areas.

And if you want to get more specific and dive deep into a technology or a disruptive solution, to identify a clear impact or reduce risk for your workloads, AWS has offered AWS Well-Architected Lenses since 2017.

Some examples of Lenses which, from my point of view, bring value are:

Management and Governance Lens – AWS Well-Architected

Hybrid Networking Lens – AWS Well-Architected

SAP Lens – AWS Well-Architected

Financial Services Industry Lens – AWS Well-Architected

Serverless Applications Lens – AWS Well-Architected

In the second part of this post, we will explain the Azure Well-Architected Framework. I hope it's useful to you and makes your day!

Enjoy the journey to the cloud with me…see you then in the next post.

Azure Lighthouse, the secret sauce for any Managed Cloud Solution Provider

Managed Cloud Solution Providers (MCSPs) are third-party companies that help your business expand, providing muscle and expertise in two ways:

  1. Skills matrix to support you – They have a bench of experts in several disciplines to walk you through your IT service challenges and digital transformation; they are your mentor for understanding your risk and how well your investment in cloud solutions aligns with your business. They have cloud architect and cloud strategist personas on their team to support your journey to the cloud, mostly in hybrid scenarios.
  2. Tools to support you – They have the right tools to support those business needs and to lift your current digital estate into a new version of your company, achieving better efficiency in your daily processes, simplifying your employees' work (even improving their quality of life) and, for sure, optimizing the time to react to your competitors with innovation. Just to note, “tools” means not only third-party tools but also the native tools the cloud provider makes available when you consume cloud services.

In addition to those key points, the most important operations for supporting IT services in the cloud are based on specific daily tasks: monitoring, backup, process automation, and security are among them. Moreover, MCSPs need to solve issues effectively in order to provide the right quality to their customers, something called “Operational Excellence” within the Well-Architected Framework. With the massive expansion of cloud-first IT services, and the migration to the cloud of a huge amount of IT infrastructure supporting data analytics, web services, disaster recovery, and legacy applications on the road to modernization, we need the right tools to cover some clear objectives. Azure Lighthouse has tremendous maturity to solve many of the challenges any MCSP has to cope with:

  • Scale as soon as you need to grow (horizontally, I mean). Even when you have to assist many customers, you can cover their needs with granularity and focus on each one's specific roadmap to the cloud.
  • Segment your own IT cloud infrastructure from your customers'. Any security issue or IT service downtime, internal or customer-facing, is contained and can affect only one customer or group of customers.
  • Grant permissions to specific IT resources in the cloud and, depending on your customers' projects and the skills involved, delegate access to other partners or freelancers; in short, collaborate on each new project with several profiles.
  • Get a whole picture of the IT services you provide to your customers across several Azure contracts and tenants: security posture, performance and health alerts, triage of misconfigured items, the right Azure governance, etc.

Azure Lighthouse has the potential and flexibility to bring monitoring and traceability to all your customers across several tenants: you get a holistic view, delegate specific permissions at a strong security level (for a fixed period or as long as you want) over whole subscriptions or resource groups, integrate everything into a hybrid strategy together with Azure Arc, and even consolidate security posture and SIEM across tenants. Azure offers top native cloud tools to support your investments in almost any technology trend.
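Onboarding a customer to Azure Lighthouse is done with an ARM template deployed in the customer tenant. Below is a minimal sketch of the registration definition half; the tenant ID, principal ID, and display names are placeholders, the built-in Contributor role is used only as an example, and a matching Microsoft.ManagedServices/registrationAssignments resource would complete the onboarding:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-08-01/subscriptionDeploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [
    {
      "type": "Microsoft.ManagedServices/registrationDefinitions",
      "apiVersion": "2019-06-01",
      "name": "[guid('contoso-mcsp-delegation')]",
      "properties": {
        "registrationDefinitionName": "Contoso MCSP - Managed Services",
        "description": "Delegated access for monitoring and support",
        "managedByTenantId": "<MSP-tenant-GUID>",
        "authorizations": [
          {
            "principalId": "<security-group-object-id>",
            "principalIdDisplayName": "Tier-1 Support Engineers",
            "roleDefinitionId": "b24988ac-6180-42a0-ab88-20f7382dd24c"
          }
        ]
      }
    }
  ]
}
```

Granting the role to a security group rather than individual users is what makes the delegation scale: the MCSP manages membership on its side without touching the customer tenant again.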

Let's go deeper into some nice strategies for any MCSP, so they don't struggle trying to map what they do on-premises today onto Azure.

Access. A secure authentication and authorization strategy is mandatory. That's why Microsoft offers a least-privilege access approach with Azure Active Directory Privileged Identity Management (Azure AD PIM), to tighten access to customer tenants even further with just a user or a security group.

Monitoring. Absolutely key for any MCSP: it is the core of the support you give your customers. On top of that, you have to use the right ITSM (IT Service Management) software to stay aligned and move in the right direction when assessing and resolving customer issues, from high priority to low.

Security posture. Do you know how many misconfigurations and vulnerabilities exist in your customers' Azure clouds? With Azure Security Center you can establish the right security posture and know which security controls are affected or can be mapped to your regulatory compliance. You can leverage the Secure Score to see your customers' security posture in a single pane of glass. Not exactly easy-peasy, but it helps a lot.

Incident hunting. Maybe you know it, maybe you don't: Azure Sentinel, the Microsoft native SIEM, can help consolidate your security threats and dig into the root cause of a security compromise across several tenants.

It's a powerful tool to track logs, see layer by layer what is happening, and determine how to step up suitable hardening for your technologies.

Hybrid scenarios. Azure Arc can be integrated with Azure Lighthouse as well, adding to the holistic overview I mentioned before. The main goal here is to provide the right governance to your customers even if they have private clouds or on-premises infrastructure. It is therefore an exciting approach for companies that still have a lot of legacy stuff to migrate over the years but want to explore the benefits of a public cloud such as Azure.

To sum up, depending on your cloud provider's maturity level, there are key native tools to improve your support, on your own or with the help of an MCSP. Azure, together with AWS, is one of the most important providers offering this level of tooling today.

Enjoy the journey to the cloud with me…see you soon.

7 Rs – Seven roads to taking the right decision on moving to the cloud

AWS (Amazon), Azure (Microsoft), and GCP (Google) hyperscale data centers have been growing in number over the last few years in many regions, supported by millions in investment in submarine cables to reduce latency. Southern Europe is no exception: just look at Italy, Spain, and France to realize what is happening.

Public cloud providers know many customers will massively move thousands of services in the coming years. The process started only a few years ago, but it has accelerated even more with the pandemic and the need to provide remote services, to analyze data faster and more efficiently, the explosion of sensors measuring almost everything in our lives, and a global market where companies beat competitors on any continent through innovation.

There are 7 Rs for taking the right decision, so CIOs and CTOs know what makes sense to move to the cloud and what doesn't, what the priorities are, and moreover the impact and effort of transforming their business.

The AWS perspective on moving IT services to the cloud

Moving to the cloud with a clear perspective on outcomes and goals will bring value to your customers, provided you evaluate each of your IT services with care and take decisions aligned with your business. Some applications could be retired, others will enter a modernization cycle, and others will just be resized to reduce cost and improve resilience.

Let's explain the 7 Rs, from the simplest scenarios to the most complex:

Retire. Some applications are not used anymore; just a couple of users run some queries from time to time. It may be better to move that old data to a data warehouse and retire the old application.

Retain. It literally means “do nothing at all”. Maybe this application uses an API or backend from an on-premises solution with compliance limitations. Maybe it was recently upgraded and you want to amortize the investment for a while.

Repurchase. Here you have the opportunity to change the IT solution. Let's say you are not happy with your on-premises firewall and think it's better to switch to a different provider with better firewall integration for AWS or Azure, or even to move some applications from IaaS to SaaS.

Relocate. For example, relocate the ESXi hypervisor hosting your database and web services to VMware Cloud on AWS / Azure / GCP, or move your legacy Citrix server with Windows 2008 R2 to a dedicated host on AWS.

Rehost. This means lift-and-shift: move some VMs with clear dependencies between them to the cloud, just to get better backup, cheaper replication across several regions, and resized compute consumption to reduce cost.

Replatform. Lift and optimize your application somehow. For instance, move your web services from a farm of VMs on VMware with an HLB (hardware load balancer) on-premises to an external load-balancing service on Azure with some App Services, where you can keep your business logic and migrate your PHP or Java application. You no longer have to worry about operating system patching or security at the Windows Server level, and you can even eliminate the Windows operating system license.

Refactor. The most complex scenario. You have a big monolithic database with lots of applications reading and writing that data heavily. You know you need to move the applications and the monolithic database and modify the architecture, taking full advantage of cloud-native features to improve performance and scalability and to reduce risk: today, any failure in one component provokes a general failure. You need to decouple your components and move to microservices sooner or later.
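As a toy illustration of that decoupling (all names are invented), the sketch below replaces a direct, synchronous call into the monolith with an event queue that an independent consumer drains at its own pace, so a failure in one side no longer takes down the other:

```python
# Toy sketch of decoupling: the producer publishes events to a queue
# instead of calling the database layer directly; a separate consumer
# service processes them independently. Names are illustrative only.
import queue

events: "queue.Queue[dict]" = queue.Queue()

def order_service(order_id: int) -> None:
    """Publishes an event instead of writing to the monolith synchronously."""
    events.put({"type": "order_created", "order_id": order_id})

def billing_consumer() -> list:
    """An independent microservice draining the queue at its own pace."""
    processed = []
    while not events.empty():
        processed.append(events.get())
    return processed

order_service(1)
order_service(2)
print(billing_consumer())  # both events processed by the consumer, independently
```

In a real refactor, the in-process queue would be a managed broker (a message queue or streaming service), but the design principle is the same.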

I hope those strategies for moving your applications to the cloud are clearer now, so you can stay laser-focused on your needs and find the best approach for each of them.

To sum up, use the right tools to evaluate your on-premises applications and IT services, and, based on the 7 Rs, choose the right journey to the cloud for each of them.

Don't forget to leverage the full potential of the CAF (Cloud Adoption Framework), which I've mentioned before in my blog, together with the 7 Rs strategy.

Enjoy the journey to the cloud with me…see you soon.

Containerization to become the RockStar on the stage

The CNCF (Cloud Native Computing Foundation) couldn't be clearer in their 2020 survey report:

The use of containers in production has increased to 92%, up from 84% last year, and up 300% from our first survey in 2016. Moreover, Kubernetes use in production has increased to 83%, up from 78% last year.

Related to the usage of cloud native tools, there are also some clear trends:
• 82% of respondents use CI/CD pipelines in production.
• 30% of respondents use serverless technologies in production.
• 27% of respondents use a service mesh in production, a 50% increase over last year.
• 55% of respondents use stateful applications in containers in production.

What happens when a company adopts containers just for testing? In less than two years, containers are adopted in pre-production and production as well.

Why is containerization so widespread?

Here are some of the reasons I've identified.

DevOps friendly – The reasons here are clear as water. Almost all the big companies in the enterprise segment already have a DevOps CI/CD strategy, and they have realized that integrating builds and delivery versions with containers is quite agile and effective: the runtime can be isolated easily and doesn't depend on an operating system, so the latest software versions can be compared with their different sets of libraries. To summarize, you can quickly spin up several pods with containers ready to test two or three versions of your products, each with its libraries, plugins, package managers, or other artifacts, and test features, UX, bugs, or just performance, all aligned with your preferred repository solution: Bitbucket, Git, GitHub, etc.
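To sketch that runtime isolation, here is a minimal Dockerfile for a hypothetical Python app; file names, versions, and the base image are illustrative only:

```dockerfile
# Minimal sketch: packaging a hypothetical Python web app so its runtime
# and libraries are isolated from the host OS. Names and versions are
# placeholders, not recommendations.
FROM python:3.11-slim

WORKDIR /app

# Pin the app's libraries per version under test
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Each image tag (v1, v2, ...) can run side by side in CI to compare versions
CMD ["python", "app.py"]
```

Building one image per candidate version is what lets a pipeline run v1 and v2 side by side with different library sets on the same host.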

Multicloud – Another quite solid fact: Kubernetes runs on any cloud, private or public, and you can orchestrate clusters with nodes wherever you want, without limitations on storage, compute, or location. You even have a number of container orchestration tools at your disposal, not just Kubernetes but also Docker Swarm. As a side note on how fast technology changes from one day to the next: Docker as a simple container runtime was the trend in the RightScale 2019 survey, but Docker as an underlying runtime is now being deprecated in Kubernetes in favor of runtimes that use the Container Runtime Interface (CRI). Still, Docker remains a useful tool for building containers.
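To illustrate the portability point, a minimal Kubernetes Deployment manifest like the sketch below can be applied unchanged (kubectl apply -f deployment.yaml) to a cluster running on any cloud; the image name and replica count are placeholders:

```yaml
# Sketch of a portable Kubernetes Deployment; the same manifest works on
# any conformant cluster, private or public. Image name is a placeholder.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-demo
spec:
  replicas: 3
  selector:
    matchLabels:
      app: web-demo
  template:
    metadata:
      labels:
        app: web-demo
    spec:
      containers:
        - name: web
          image: registry.example.com/web-demo:v1   # placeholder image
          ports:
            - containerPort: 8080
```

Because the manifest only describes the desired state, moving the workload between clouds is mostly a matter of pointing kubectl at a different cluster.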

Cost savings – You can roll out microservices on demand, without investing a euro in hardware if you want a pure cloud solution. Just create your pods or simple containers and kill them whenever you want. Pay as you go, pure OPEX: reduced CAPEX on hardware and licenses, and no amortization to worry about.

Remove your legacy applications at your own pace – On one hand, big companies want to reduce legacy applications: they need to eliminate monolithic applications, which tend to be critical, with old software versions, hardware and license dependencies, and poor performance and scalability. On the other hand, they are more committed than ever to the “cloud first” principle for new IT services, because they need to be global, reduce cost, and improve resiliency, and many CIOs know the public cloud brings those advantages from scratch.

Security – Last but not least. Containerization reduces the exposed surface of your applications, eliminates whole classes of operating system bugs, and lets your software quality team and your CISO take control of known library vulnerabilities. Networking is also an area where you can watch out for the bad guys: traffic flows in and out of the containers, and you can configure with granularity what is allowed and what is not. Finally, you can monitor the whole microservices solution with open source tools, the cloud providers' integrated tools, or more veteran third-party solutions.
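As a sketch of that network granularity, a Kubernetes NetworkPolicy can whitelist exactly which pods may talk to your containers; the labels and port below are placeholders:

```yaml
# Sketch: only pods labeled role=frontend may reach the app on port 8080;
# all other ingress to the selected pods is denied. Labels are placeholders.
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: allow-frontend-only
spec:
  podSelector:
    matchLabels:
      app: web-demo
  policyTypes:
    - Ingress
  ingress:
    - from:
        - podSelector:
            matchLabels:
              role: frontend
      ports:
        - protocol: TCP
          port: 8080
```

Note that enforcement depends on the cluster's network plugin supporting NetworkPolicy; without one, the policy is accepted but has no effect.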

In the next post we will see differences and similarities between AKS and EKS.

Enjoy the journey to the cloud with me…see you then in the next post.

Azure Synapse: A new kid on the block. Empower your company's Big Data and analytics

Some years ago, an investment in data analytics was quite expensive in terms of hardware, networking, knowledge and skills (usually external to the organization) and, obviously, data center facilities. Nowadays you can enjoy cloud-native data analytics tools that can be deployed in minutes in any region of the world. These cutting-edge technologies are evolving to work better together, the way an orchestra evolves as the musicians and the conductor get to know each other and deliver a splendid performance. The same happens in the cloud: the maturity of the native tools lets you decouple components so that they run and scale independently.

But why is Big Data on-premises doomed to extinction? It is a matter of cost-effectiveness in the medium term. Several factors have a great impact on CIOs and CFOs changing their minds:

Big Data on-premises is rigid and inelastic: the capacity planning the architects do to build those solutions is based on peaks and has to account for the worst performance cases. The platforms cannot scale on demand, and if you need more resources you may wait weeks until they are available. On the other hand, you carry a technical debt if you underutilize your Big Data infrastructure.

Big Data and data analytics platforms on-premises require a lot of in-house skills and knowledge, from storage to networking, from data engineering to data science. They are complex to maintain and upgrade, which leads to failures and low productivity.

Data and AI/ML live in separate worlds in on-premises infrastructure: two silos you need to interconnect. That doesn't happen in the cloud.

Move to the next level. Azure Synapse

Azure Synapse is a whole orchestra prepared to deliver a splendid performance. It is the evolution of Azure SQL Data Warehouse, as it joins enterprise data warehousing with Big Data analytics.

It unifies data ingestion, preparation, and transformation, so companies can combine and serve enterprise data on demand for BI and AI/ML. It supports two types of analytics runtimes, SQL- and Spark-based, that can process data in batch, streaming, and interactive modes. For data scientists it is great because it supports a number of languages typically used by analytic workloads, such as SQL, Python, .NET, Java, Scala, and R. And you don't have to worry about scaling: you have virtually unlimited scale to support analytics workloads.

Deploy Azure Synapse in minutes – Using Azure Quickstart Templates, it is possible to deploy your data analytics platform in minutes: choose 201-sql-data-warehouse-transparent-encryption-create, synchronize it with your repo on Azure DevOps, and start configuring your deployment strategy.

Ingesting and processing data enhancements – Data from several origins can be loaded into the SQL pool component of Azure Synapse (let's say, the old data warehouse). To load that data, we can use a storage account, or even better a Data Lake Storage account with the help of PolyBase; we can use another Azure component called Azure Data Factory to bring data from several origins; or traditional tools like BCP for those working with SQL. After cleaning the data in staging tables, you can copy everything that makes sense to production.
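A hedged T-SQL sketch of that staging flow, using the COPY statement of a dedicated SQL pool; the table, storage account, and container names are placeholders:

```sql
-- Sketch: load raw files from a data lake into a staging table with COPY INTO.
-- Account, container, and table names are placeholders.
COPY INTO dbo.StageSales
FROM 'https://mydatalake.blob.core.windows.net/raw/sales/*.csv'
WITH (
    FILE_TYPE = 'CSV',
    FIRSTROW = 2,                              -- skip the header row
    CREDENTIAL = (IDENTITY = 'Managed Identity')
);

-- After validating the staged rows, promote them to the production table
INSERT INTO dbo.FactSales
SELECT * FROM dbo.StageSales;
```

The staging table is the place to run cleansing and deduplication before anything touches production, exactly as described above.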

A great advantage is that you can now get rich insights into your operational data in near real time, using Azure Synapse Link. ETL-based systems tend to have higher latency for analyzing operational data, due to the many layers needed to extract, transform, and load it. With the native integration of the Azure Cosmos DB analytical store with Azure Synapse Analytics, you can analyze operational data in near real time, enabling new business scenarios.

Querying data – Azure Synapse uses Massively Parallel Processing (MPP) to run queries across petabytes of data quickly. Data engineers can use the familiar Transact-SQL to query the contents of a data warehouse in Azure Synapse Analytics, while developers can use Python, Scala and R against the Spark engine. There is also support for .NET and Java.

Moreover, it is now possible to query data on demand, without provisioning dedicated resources…
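A hedged sketch of that on-demand pattern: the T-SQL below is the kind of query you can run directly over files in the data lake with the serverless option, with no pool to size beforehand. The storage account and path are placeholders:

```python
# Build a serverless, on-demand query over Parquet files in the lake
# using OPENROWSET. The file URL is a placeholder.
def on_demand_query(file_url: str) -> str:
    return (
        "SELECT TOP 10 *\n"
        "FROM OPENROWSET(\n"
        f"    BULK '{file_url}',\n"
        "    FORMAT = 'PARQUET'\n"
        ") AS [result];"
    )

query = on_demand_query(
    "https://mydatalake.dfs.core.windows.net/sales/year=2020/*.parquet"  # placeholder
)
print(query)
```

You pay per data scanned rather than per provisioned pool, which makes this attractive for exploration before committing to a dedicated warehouse schema.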

Authentication and security – Azure Synapse Analytics supports both SQL Server authentication and Azure Active Directory. You can also configure an RBAC strategy so data is accessed by least-privileged principals.
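As a small sketch of the Azure AD option, this is the shape of an ODBC connection string a client might build for the Synapse SQL endpoint. The workspace and database names are placeholders; with interactive Azure AD authentication, the tenant's MFA policy is enforced at sign-in:

```python
# Build an ODBC connection string using Azure Active Directory
# authentication instead of SQL Server logins. Names are placeholders.
def synapse_connection_string(server: str, database: str) -> str:
    return (
        "Driver={ODBC Driver 17 for SQL Server};"
        f"Server=tcp:{server},1433;"
        f"Database={database};"
        "Authentication=ActiveDirectoryInteractive;"
        "Encrypt=yes;TrustServerCertificate=no;"
    )

conn_str = synapse_connection_string("myworkspace.sql.azuresynapse.net", "salesdw")
print(conn_str)
```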

Finally, you can even implement MFA to protect your data and operational work.

In the next post, I will show you how other pieces and components of cloud data solutions work, and the great benefits they bring in cost savings and technical advantages.

See you then…

Agrotech – A new revolution is coming to Europe's farmlands and crop zones…

Where is the opportunity?

There are large areas of land dedicated to the same crop, which means an increased risk of pests and greater effort in the fight against them. There are many plantations of corn, vines, fruit trees or simply cereals where the soils support a tremendous demand to give the appropriate results to farmers and growers. In the case of Spain, southern areas such as Almería produce tons of vegetables, fruits and plants of all kinds under significant water pressure, since they are places where there is little rain.

Adding to that, we are facing significant rural depopulation in countries such as Italy, Spain, Portugal and France, and further east in Europe, in Romania, Bulgaria and others. Older people are left alone in rural settings, and young people are less and less present in those small villages and medium-sized towns surrounded by farmland.

We need an urgent answer, and it is "control" and "automation". We need efficiency, even with a small staff, to take care of undesirable insects, floods, droughts and fertilizers.

Why the public cloud, with native IoT tools and edge computing, brings the solution…

On one hand, IoT brings efficiency to growers and farmers, so they know the best moment in the season for sowing, irrigating or harvesting.

On the other hand, it provides them with forecasts and a series of historical data so they can improve their response in the future.

Finally, you don't need many people to keep control of vast cereal fields, for example. Moreover, you can program some tasks to run automatically following a pattern of conditions.
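That "pattern of conditions" idea can be sketched as a tiny rules engine: each rule maps a sensor condition to a farm task. The thresholds, field names and actions below are invented for illustration, not taken from any real deployment:

```python
# Toy rules engine: evaluate one sensor reading against a list of
# condition -> action rules and return the actions that fire.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Rule:
    name: str
    condition: Callable[[Dict[str, float]], bool]
    action: str

RULES = [
    Rule("dry_soil", lambda r: r["soil_moisture"] < 0.25, "start_irrigation"),
    Rule("frost_risk", lambda r: r["temperature_c"] < 1.0, "activate_frost_protection"),
    Rule("pest_trap_full", lambda r: r["trap_count"] > 50, "notify_agronomist"),
]

def evaluate(reading: Dict[str, float]) -> List[str]:
    """Return the actions triggered by one sensor reading."""
    return [rule.action for rule in RULES if rule.condition(reading)]

reading = {"soil_moisture": 0.18, "temperature_c": 12.0, "trap_count": 12}
print(evaluate(reading))  # → ['start_irrigation']
```

In a real solution these rules would run on the edge node, so irrigation can start even when connectivity to the cloud is down.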

What do the public cloud providers offer…

This picture (based on the Microsoft Azure approach) shows what an IoT solution for Agrotech could look like.

  1. Sensors provide data. Edge nodes, which are responsible for data processing, routing and computing operations, reduce latency and provide a first repository for the data to be transmitted to the cloud. Sensors work with many different data formats, mostly unstructured, but also some tabular and well structured.
  2. IoT Hub is in charge of ingesting data from the sensors. It can process streaming data in real time with security and reliability. It is a managed cloud solution which supports bidirectional communication, from the devices to the cloud and from the cloud to the devices. That means that while you receive data from devices, you can also send commands and policies back to those devices, for example, to update properties or invoke device management actions. It can also authenticate access between the IoT devices and the IoT hub. It can scale to millions of simultaneously connected devices and millions of events per second, and be aligned with your policies in terms of security, monitoring or disaster recovery to another region.
  3. Cosmos DB is a globally distributed, multi-model database that can replicate datasets between regions based on customer needs. You can tailor the reads and writes of the data across several partitions, even at a planetary scale if you want. Its multi-model architecture allows database engineers to leverage the inherent capabilities of each model: the MongoDB API for semi-structured data (JSON or Avro files can be perfect here), Cassandra for wide columns (for example, to store data for products with several properties), or Gremlin for graph data (for example, data for social networks or games). Hence, it can be deployed using several API models for developers. In our scenario it can be used to analyze large operational datasets while minimizing the impact on the performance of mission-critical transactional workloads. Besides this powerful database solution, we can use Azure Synapse, which is key in the transformation of the data. It is a new Azure component where you are able to ingest, prepare, manage and serve all the data for immediate BI and machine learning needs more easily. It uses Azure SQL Data Warehouse to store historical series of data, and Massively Parallel Processing (MPP) to run queries across petabytes of data quickly, integrating the Spark engine to work with predictive analytical workloads. Azure Synapse Analytics uses the Extract, Load, and Transform (ELT) approach. Once we have applied ML, streaming or batch processing to the data ingested before, it's time to report our information according to the growers' or farmers' needs.
  4. Presentation Layer. You can visualize the data for example with Power BI integrated with Azure Synapse.
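As a small sketch of step 2 above, this is the kind of device-to-cloud telemetry message an edge node might forward to IoT Hub. The device id and field names are hypothetical; a real device would send this payload over MQTT or with the Azure IoT device SDK rather than just building it:

```python
# Build a JSON telemetry payload like the one a field sensor could send
# to IoT Hub. Device id and measurements are invented for illustration.
import json
from datetime import datetime, timezone

def build_telemetry(device_id: str, soil_moisture: float, temperature_c: float) -> str:
    message = {
        "deviceId": device_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "soilMoisture": soil_moisture,
        "temperatureC": temperature_c,
    }
    return json.dumps(message)

payload = build_telemetry("field-7-sensor-3", 0.31, 18.4)
print(json.loads(payload)["deviceId"])
```

Keeping the payload small and self-describing matters at this layer: IoT Hub bills and throttles per message, and downstream consumers such as Cosmos DB or Synapse ingest it as-is.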

To summarize, the IoT market is growing rapidly: about 25 billion connected objects are expected worldwide by 2025. There is a major opportunity to transform our society and enhance our agricultural sector.

See you then in the next post…

What happens when cloud adoption is more than that? Do you have a cloud strategy based on your IT profile?

We all think that there are three possible stages in your journey to the cloud.

First, digital starters: companies looking for advice to start moving some workloads to the public cloud from their on-premises or private cloud infrastructure. Then, digital expanders: companies with some experience already on the public cloud and satisfied with the outcome of a first cloud adoption on those projects. Finally, digital leaders: companies, cloud native or not, with important investment in OPEX and very focused on the business motivations and outcomes of being cloud first in almost everything they do.

But what happens if we change our perception? What if we think there is a conservative IT profile, a moderate IT profile or even an aggressive one, in terms of how they leverage cloud native technologies?

On one hand, another factor to be evaluated is not just how to prepare the cloud adoption with methodologies like the CAF, but also to understand that not all companies need a data analytics platform or an IoT solution, at least during the coming years.

On the other hand, how can you reflect those cloud flavours and cloud native technologies in the real world? Well, we can start with this full picture that came to my mind some time ago…

Depending on your IT profile, you will be working with some of these cloud flavours.

This picture (based on the Microsoft Azure approach) tries to represent that there are several technologies we can group by cloud strategy and associate with an IT profile.

My cloud vision based on technologies and cloud strategies

A conservative IT profile – This would be a company mostly based on traditional infrastructure, with storage, backup or archiving as its most important priorities, as well as some VMs or LOB applications. Sectors like banking and finance are well represented here. Actors like hardware providers still support them on their on-premises and private cloud platforms. They also have a big investment in leading hypervisors and complex computing technologies, and use some scalability with containers and autoscale sets, but limited to their own resources. User experience and usability of their apps is not a strong point, and automation of processes with RPA, or the use of modern DevOps platforms, is also not very extended in these companies. They have lots of legacy applications and monolithic databases, old data warehouses and traditional ERPs.

Conservative IT profile

A moderate IT profile – This would be a company more focused on providing an app or an e-commerce platform, with almost no downtime and scaling based on seasonal demand. They may even be migrating some specific workloads to bring innovation, to work in a global way with other subsidiaries, or to leverage the potential of some disruptive solution, like bots, to improve the user experience for their customers. Hardware almost disappears in this kind of company. They have a hybrid model and are starting to embrace the disruption of new cloud native solutions such as cognitive services, machine learning or data analytics. They are even integrating SaaS technologies like DocuSign, and use the marketplace to replace some third-party products that were present in the on-premises data center they had before. An example of this profile can be retail companies offering a new online shopping experience.

Moderate IT profile

An aggressive IT profile – Be cloud first. All they want is to work on the public cloud whenever possible, as they learned a lot about the benefits and outcomes when their CAF work and the progressive migration of workloads were well architected and well defined. They have tremendous knowledge of leveraging disruptive technologies, saving costs, providing the right governance and security, and achieving their goals. These companies are very dynamic, use agile methodologies, and have clear priorities on accelerating daily processes and the business and improving the employee and customer experience. Innovation is their mantra. Here you will see startups like fintechs, healthtechs, etc. You will also see enterprise verticals such as renewable energy companies, insurance, or the chemical and pharmaceutical industry. They use data analytics and Big Data massively, along with ML, PaaS, serverless and modern DevOps platforms. They reduce investment in hardware and licenses, and integrate SaaS, blockchain and other technologies into the daily user experience. They also provide apps and remote work to their users.

Aggressive IT profile

To summarize, this post just tries to show that out there, adopting the cloud, each company and each public institution has its own hat and can tailor the technologies to its needs. Finally, not all companies in each vertical or business fit these descriptions, but without stereotyping, it is a way of defining the types of companies that will sooner or later make use of the benefits of the public cloud.

See you then in the next post…