Be hybrid, my friend: Global Azure Vision

After one year of pandemic, one fact is very clear: a majority of enterprises expect to increase their cloud usage. In this scenario there are traditional lift-and-shift migrations, but many companies also choose to "PaaS-ify" applications (a strategy to move applications running on VMs to multi-tenant managed cloud platforms like Azure App Service or AWS Elastic Beanstalk) or go even further and containerize them (the process of packaging an application along with its required libraries, frameworks, and configuration files on top of a containerization engine such as Docker).

In this context there are still lots of legacy applications and monolithic workloads that will remain the backbone of business logic for a huge number of industries for some years to come. Not to mention compliance or sovereignty policies that require specific information to be retained in local data centers, whether for the company or the country. So the battle for cloud providers in the coming years is to go hybrid enough to leverage the cloud for new, innovative solutions: for those areas where competitors could win opportunities in our market, or where we can see clear benefits in moving applications to the cloud, such as faster go-to-market in other regions, improved efficiency, reduced risk, cost savings, and the elimination of single points of failure.

What is the Microsoft hybrid strategy?

There are several technologies that bring a lot of value to the Azure hybrid scenario. The mantra here is: run what you want, where you need it, without losing control, even if it's on premises, in a private cloud like BT or Telefónica, or in a different cloud provider like AWS with its EC2 IaaS compute solutions.


Azure Stack Hub

Azure Stack will be your solution if you want to leverage the potential of cloud services such as serverless while also using your infrastructure locally. You can connect your local data center to Azure using Azure Stack Hub.

For example, let's say you need to keep a data warehouse on premises due to regulations, but from time to time you need HPC (high-performance computing) or even MPP (massively parallel processing) to perform some calculations on a dataset, and you don't want to invest a lot of money in these occasional estimations. All the outcomes will be stored locally once they have been prepared and transformed into more accurate data in the cloud. Obviously, the cluster and its worker nodes will be shut down afterwards.
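The burst pattern above can be sketched in a few lines: fan a calculation out to many short-lived workers (in Azure these would be cluster nodes that are torn down afterwards), then persist only the aggregated outcome locally. A minimal local simulation, where the dataset, the sum-of-squares calculation and the file-based store are all placeholder assumptions:

```python
import json
import tempfile
from concurrent.futures import ThreadPoolExecutor

def burst_compute(dataset, workers=4):
    """Simulate an on-demand compute burst: split the dataset into chunks,
    process them in parallel (stand-ins for cloud worker nodes), and keep
    only the aggregated outcome, since the cluster goes away afterwards."""
    chunk = max(1, len(dataset) // workers)
    chunks = [dataset[i:i + chunk] for i in range(0, len(dataset), chunk)]
    # The "cluster" only exists inside this block, like a transient HPC job.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = list(pool.map(lambda c: sum(x * x for x in c), chunks))
    return sum(partials)

def store_locally(result):
    """Persist the prepared outcome on premises (here: a temp file)."""
    with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
        json.dump({"sum_of_squares": result}, f)
        return f.name

result = burst_compute(list(range(1000)))
path = store_locally(result)
```

The point of the shape is that only `result` survives the burst; the intermediate partials, like the cloud cluster itself, are discarded.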

Azure Stack Edge

Collect, analyse, transform and filter data at the edge, sending only the data you need to the cloud for further processing or storage. Use ML (machine learning) to prepare the datasets you need to upload to the cloud. Azure Stack Edge acts as a cloud storage gateway that transfers to Azure only what is needed, while retaining local access to files. It has a local cache and bandwidth throttling to limit usage during peak business hours.
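A hedged sketch of that filter-then-forward idea: the raw stream stays at the edge, and only a summary plus the readings that matter travel to the cloud. The sensor field names and the temperature threshold are invented for illustration:

```python
from statistics import mean

def filter_at_edge(readings, threshold=30.0):
    """Edge-side preprocessing: keep the raw stream local and forward to
    the cloud only an aggregate plus readings that exceed a threshold."""
    outliers = [r for r in readings if r["temp_c"] > threshold]
    summary = {
        "count": len(readings),
        "avg_temp_c": round(mean(r["temp_c"] for r in readings), 2),
    }
    # This dict is all the cloud ever sees; the full stream stays local.
    return {"summary": summary, "alerts": outliers}

readings = [{"sensor": "s1", "temp_c": t} for t in (21.5, 22.0, 35.2, 23.1)]
to_cloud = filter_at_edge(readings)
```

Four readings go in; one alert and a two-field summary come out, which is the bandwidth saving the appliance is built around.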

Boost your IoT and edge computing solutions with this technology. The only limit to the opportunities here is your imagination.

There are several models that can work at your edge depending on your needs. Simply order your appliance from the Azure portal in a hardware-as-a-service model and pay monthly via an Azure subscription.

Azure Stack HCI

It is a new hyperconverged infrastructure (HCI) operating system delivered as an Azure service, providing the latest Azure features as well as the performance to work with the cloud. With it, you can roll out Windows and Linux virtual machines (VMs) in your data centre or at the edge using the appliances shown above.

For example, let's say you want to set up a disaster recovery strategy on world-class hyperconverged infrastructure, with some Linux LAMP solutions or specific applications whose backend tier runs on Azure Stack HCI and whose frontend web services run on Azure. Once again, the data remains in your data center if your country or company regulations don't allow it to be stored in the cloud.

But the strongest point of this Microsoft hybrid solution is its integration with AKS, so your applications can run anywhere, from on premises to any Azure region. You will be able to deploy containers on the same network, with your VNet on Azure joined to your VLAN on premises, and move, create or kill thousands of containers, each application with its own libraries, runtime and pieces of software, from cluster to cluster, empowered by Kubernetes. Can you imagine such potential for a global enterprise?

Azure Arc

Here comes the key ingredient of the recipe. Azure Arc lets users connect Kubernetes clusters running on premises or on any other cloud provider to Azure for a unified management experience. Arc provides a single-pane-of-glass operating model for all the Kubernetes clusters deployed across multiple locations and platforms. Arc brings Azure management capabilities to these clusters, enhancing the experience with Azure features like Azure Policy, Azure Monitor, and Azure Resource Graph.

From that single pane of glass you can embrace the potential of the Azure hybrid model across multiple tenants and subscriptions, working together with Azure Lighthouse, as well as integrating Azure Stack to roll out your application modernization strategy anywhere, anytime. Can you picture the tremendous benefits for your customers, users and partners of being there when you are needed: reduced risk by eliminating single points of failure, lower latency, improved business continuity, better security and governance, and exponentially faster go-to-market strategies?

In the next post, we will compare the hybrid potential Microsoft offers with that of another giant, AWS.

See you then, take care and stay safe…

Azure Synapse: a new kid on the block. Empower your company's Big Data and analytics

Some years ago, investing in data analysis was quite expensive in terms of hardware, networking, knowledge and skills (usually external to the organization) and, obviously, data center facilities. Nowadays you can enjoy cloud-native data analytics tools that can be deployed in minutes in any region of the world. These cutting-edge technologies are evolving to work better together, much as an orchestra improves as the musicians and the conductor get to know each other and can then give a splendid performance in concert. The same happens in the cloud: the maturity of the native tools lets you decouple components so that they run and scale independently.

But why is Big Data on premises doomed to extinction? Well, it is a matter of being cost-effective in the medium term. There are some factors that have a great impact in changing the minds of CIOs and CFOs:

Big Data on premises is rigid and inelastic: the capacity planning done by the architects who build those solutions is based on peaks and needs to account for the worst cases in performance. They cannot scale on demand, and if you need more resources you have to wait until they are available, sometimes for weeks. On the other hand, you carry a technical debt if you underutilize your Big Data infrastructure.

Big Data and data analytics platforms on premises require a lot of in-house skills and knowledge, from storage to networking, from data engineering to data science. They are complex to maintain and upgrade, which makes them prone to failures and low productivity.

Data and AI/ML live in separate worlds in an on-premises infrastructure: two silos that you need to interconnect. That is something that doesn't happen in the cloud.


Move to the next level: Azure Synapse

Azure Synapse is a whole orchestra prepared to give a splendid performance in concert. It is the evolution of Azure SQL Data Warehouse, as it joins enterprise data warehousing with Big Data analytics.

It unifies data ingestion, preparation and transformation, so companies can combine and serve enterprise data on demand for BI and AI/ML. It supports two types of analytics runtimes, SQL-based and Spark-based, which can process data in batch, streaming and interactive modes. For a data scientist it is great because it supports a number of languages typically used by analytic workloads, such as SQL, Python, .NET, Java, Scala, and R. And you don't have to worry about scaling: you have virtually unlimited scale to support analytics workloads.

Deploy Azure Synapse in minutes – Using Azure Quickstart Templates it is possible to deploy your data analytics platform in minutes. Choose the 201-sql-data-warehouse-transparent-encryption-create template, synchronize it with your repo on Azure DevOps and start configuring your deployment strategy.

Ingesting and processing data enhancements – Data from several origins can be loaded into the SQL pool component of Azure Synapse; let's say the old data warehouse. To load that data we can use a storage account, or even better a Data Lake storage account with the help of PolyBase; we can use another Azure component called Azure Data Factory to bring data from several origins; or traditional tools like BCP for those working with SQL Server. After cleaning the data in staging tables, you can copy everything that makes sense to production.
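That staging-then-promote step can be illustrated with plain SQL. The sketch below uses SQLite as a local stand-in for the Synapse SQL pool (the table and column names are invented); in Synapse you would load the staging table with PolyBase and then promote the clean rows with the same INSERT … SELECT pattern:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Staging table: raw rows as they arrive from the old data warehouse.
cur.execute("CREATE TABLE stg_sales (id INTEGER, amount REAL, region TEXT)")
cur.executemany(
    "INSERT INTO stg_sales VALUES (?, ?, ?)",
    [(1, 120.0, "EMEA"), (2, None, "EMEA"), (3, 80.5, None), (4, 99.9, "APAC")],
)

# Production table receives only the rows that "make sense":
# here, cleaned of NULL amounts and NULL regions.
cur.execute("CREATE TABLE fact_sales (id INTEGER, amount REAL, region TEXT)")
cur.execute(
    """INSERT INTO fact_sales
       SELECT id, amount, region FROM stg_sales
       WHERE amount IS NOT NULL AND region IS NOT NULL"""
)
conn.commit()
promoted = cur.execute("SELECT COUNT(*) FROM fact_sales").fetchone()[0]
```

Of the four staged rows, only the two complete ones reach production; the staging table can then be truncated and reused for the next load.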

A great advantage is that you can now get rich insights into your operational data in near real time, using Azure Synapse Link. ETL-based systems tend to have higher latency for analyzing operational data, due to the many layers needed to extract, transform and load it. With the native integration of the Azure Cosmos DB analytical store with Azure Synapse Analytics, you can analyze operational data in near real time, enabling new business scenarios.

Querying data – You can use massively parallel processing (MPP) to run queries across petabytes of data quickly. Data engineers can use familiar Transact-SQL to query the contents of a data warehouse in Azure Synapse Analytics, and developers can use Python, Scala and R against the Spark engine. There is also support for .NET and Java.

Moreover, it is now possible to query on demand…

Authentication and security – Azure Synapse Analytics supports both SQL Server authentication and Azure Active Directory. You can also configure an RBAC strategy to access data with least-privileged principals.

Finally, you can even implement MFA to protect your data and operational work.

In the next post, I will show you how other pieces and components of cloud data solutions work, and the great benefits they bring in cost savings and technical advantages.

See you then…

Agrotech – a new revolution is coming to Europe's farmlands and crop zones

Where is the opportunity?

There are large areas of land dedicated to the same crop, which means an increased risk of pests and greater efforts in the fight against them. There are many plantations of corn, vines, fruit trees or simply cereals where the soils are under tremendous demand to give the appropriate results to farmers and growers. In the case of Spain, southern areas such as Almería produce tons of vegetables, fruits and plants of all kinds under significant water pressure, since these are places where there is little rain.

Adding to that, we are facing significant rural depopulation in countries such as Italy, Spain, Portugal and France, and further east in Europe in Romania, Bulgaria and others. Older people are left alone in rural settings. Moreover, young people are less and less present in those small villages and medium-sized towns surrounded by a lot of farmland.

We need an urgent answer, and it is "control" and "automation". We need efficiency, even with a small staff, to take care of undesirable insects, floods, droughts and fertilizers.

Why the public cloud, with IoT-native tools and edge computing, brings the solution

On one hand, IoT brings efficiency to growers and farmers, so they know the best moment in the season for sowing, irrigating or harvesting.

On the other hand, it provides them with forecasts and a series of historical data so they can improve their response in the future.

Finally, you don't need many people to take control of vast cereal extensions, for example. What's more, you can program some tasks to be done automatically following a pattern of conditions.

What the public cloud providers offer…

This picture (based on the Microsoft Azure approach) shows what an IoT solution for Agrotech could look like.

  1. Sensors provide data, and Edge nodes, which are responsible for data processing, routing and computing operations, reduce latency and provide a first repository for the data to be transmitted to the cloud. Sensors work with many different data formats, mostly unstructured, but also some table-based and well structured.
  2. IoT Hub is in charge of ingesting data from the sensors. It can process streaming data in real time with security and reliability. It is a managed cloud solution that supports bidirectional communication between the devices and the cloud. That means that while you receive data from devices, you can also send commands and policies back to those devices, for example to update properties or invoke device management actions. It can also authenticate access between the IoT devices and the IoT hub. It can scale to millions of simultaneously connected devices and millions of events per second (https://docs.microsoft.com/en-us/azure/iot-hub/iot-hub-scaling) and be aligned with your policies in terms of security (https://docs.microsoft.com/en-ie/azure/iot-hub/iot-hub-security-x509-get-started), monitoring (https://docs.microsoft.com/en-us/azure/iot-hub/monitor-iot-hub) or disaster recovery to another region (https://docs.microsoft.com/en-us/azure/iot-hub/iot-hub-ha-dr).
  3. Cosmos DB is a globally distributed, multi-model database that can replicate datasets between regions based on customer needs. You can tailor the reads and writes of the data across several partitions, even at planet scale if you want (https://docs.microsoft.com/en-us/azure/cosmos-db/introduction). This multi-model architecture allows database engineers to leverage the inherent capabilities of each model, such as MongoDB for semi-structured data (JSON or Avro files can be perfect here), Cassandra for wide columns (for example, to store data for products with several properties) or Gremlin for graph databases (for example, data for social networks or games). Hence, it can be deployed using several API models for developers. In our scenario it can be used to analyze large operational datasets while minimizing the impact on the performance of mission-critical transactional workloads (https://docs.microsoft.com/es-es/azure/cosmos-db/synapse-link). Besides this powerful database solution, we can use Azure Synapse, which is key in the transformation of the data. It is a new Azure component where you can ingest, prepare, manage and serve all the data for immediate BI and machine learning needs more easily, and it uses Azure Data Warehouse to store historical series of data (https://docs.microsoft.com/en-us/azure/synapse-analytics/overview-what-is). It uses massively parallel processing (MPP) to run queries across petabytes of data quickly, integrating the Spark engine to work with predictive analytical workloads, and follows the Extract, Load, and Transform (ELT) approach. Once we have applied ML and streaming or batch processing to the ingested data, it's time to report our information according to the growers' or farmers' needs.
  4. Presentation layer. You can visualize the data, for example with Power BI integrated with Azure Synapse (https://docs.microsoft.com/en-us/azure/synapse-analytics/get-started-visualize-power-bi).
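A hedged sketch of steps 1 and 2 above: a sensor reading is serialized into the kind of JSON telemetry message a device would send to IoT Hub. The field names are an application-level convention invented for this example; the actual send would use the Azure IoT device SDK with the device's connection string, which is only hinted at in a comment here:

```python
import json
from datetime import datetime, timezone

def build_telemetry(device_id, soil_moisture, temp_c):
    """Shape one device-to-cloud message. IoT Hub treats the body as
    opaque bytes, so this schema is the application's own convention."""
    body = {
        "deviceId": device_id,
        "soilMoisturePct": soil_moisture,
        "temperatureC": temp_c,
        "ts": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(body).encode("utf-8")

msg = build_telemetry("field-7-probe-1", 41.3, 18.9)
# On a real device, this payload would be sent with the Azure IoT device
# SDK, e.g. IoTHubDeviceClient.send_message(...), authenticated per device.
```

Downstream (steps 3 and 4), these messages land in storage such as Cosmos DB and are then served through Synapse to Power BI.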

To summarize, the IoT market is growing rapidly: according to Statista.com, about 25 billion connected objects are expected worldwide by 2025. There is a major opportunity to transform our society and enhance our agricultural sector.

See you then in the next post…

Fast and furious: Azure Pipelines (2) – deploy your infra and apps or releases with automation

Living in a world faster than ever, tools focused on provisioning infrastructure, applications and mobile apps in an automated way are not just important but crucial to survive in a market where companies change their strategies from one week to the next. One region can be a company's primary product market today, and tomorrow it's a different one.

The DevOps platforms of the most important providers have assumed this principle as native. Azure DevOps focuses on CI/CD like many of its competitors, but includes one secret weapon: the flexibility to deploy infra and apps in a matter of minutes, anywhere, anytime, with reliability and control.

Azure DevOps is compatible with Terraform (https://azure.microsoft.com/en-us/solutions/devops/terraform/) and with Ansible (https://docs.microsoft.com/en-us/azure/developer/ansible/overview) as ways to provide IaC (infrastructure as code). It can also use its own ARM templates (https://docs.microsoft.com/en-us/azure/azure-resource-manager/templates/add-template-to-azure-pipelines) to create, beforehand, the infrastructure needed to deploy the mobile apps or releases that sell our products in the new target market.
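To make the IaC idea concrete, here is a hedged sketch that assembles a minimal ARM template as a Python dict and renders it to JSON; the storage account resource, parameter name and apiVersion are illustrative choices, and in a real pipeline the resulting file would be checked into the repo and deployed by an ARM deployment task:

```python
import json

# Minimal ARM template: a single storage account, parameterized by name.
template = {
    "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "storageName": {"type": "string"}
    },
    "resources": [
        {
            "type": "Microsoft.Storage/storageAccounts",
            "apiVersion": "2019-06-01",  # illustrative API version
            "name": "[parameters('storageName')]",
            "location": "[resourceGroup().location]",
            "sku": {"name": "Standard_LRS"},
            "kind": "StorageV2",
        }
    ],
}

rendered = json.dumps(template, indent=2)  # what you'd commit as azuredeploy.json
```

The pipeline stage that consumes this file is then just another task in the release, which is what makes "infra first, app second" deployments repeatable.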

Finally, and quite interestingly, as a software company you need to ensure that your code is secure and free of bugs, so you don't make life easier for hackers or governments who would be more than happy to learn your pillars in the market or your most important technology or science secrets.

To solve such problems you can integrate, in a very flexible manner, tools like SonarQube: https://www.sonarqube.org/microsoft-azure-devops-integration/

Be aware that this is real: the latest case happened to SolarWinds just a few days before this post was published, provoking a big impact on US security: https://www.reuters.com/article/us-global-cyber-usa-dhs-idUSKBN28O2LY

So after this clarification, let me tell you that Azure Pipelines can reduce your technical debt (the impact you suffer when you repeatedly patch your code with easy, limited workarounds, causing delays, security holes and performance issues, instead of using a better approach from scratch) as well as improve the speed to deliver anywhere, anytime.

So we are going to create a new release and choose a scenario I created previously on Azure, where we have just a simple web app on an App Service plan F1, which is free, as it's just a small lab.

We hit "Create a new release", choose our infra in an Azure tenant (this is up to you) and select the build we want to deploy there.

Right now we have prepared our build and the infra where we want to deploy the software or release.

Just to recap, Microsoft provides two types of agents to run the jobs in charge of compiling the code or deploying the releases:

Microsoft-hosted agents, which run in Azure and are managed by Microsoft to support your software projects.

Self-hosted agents, which can be installed on VMs on premises or in a different private cloud, for example, to support the software projects.

Here we run a Microsoft-hosted agent:

We have the infra ready on Azure as well:

Finally, hit "Deploy"… and the magic happens.

You can see a sample Microsoft lab website appear on the web app quite quickly.

With this post we finish our global view of DevOps from the Microsoft perspective: a solid response to current times that solves many software life cycle challenges.

Enjoy the journey to the cloud with me…see you then in the next post.

Azure Monitor: a holistic approach to taking control of your data

Each native operational cloud tool provides tremendous value that many people don't see when they start with the public cloud. Some of them focus on providing a backup approach, others facilitate the assessment or discovery of workloads to be migrated, others cover security or just watch over specific metrics or KPIs. This is the case of Azure Monitor, a holistic monitoring tool with which you can configure customized dashboards for the most important technologies you work with daily.

Platform logs provide detailed diagnostic and auditing information for Azure access. Use the Activity Log to determine the what, who, and when for any write operations (PUT, POST, DELETE) performed on the resources in your subscription.
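The "what, who, and when" question is essentially a filter over Activity Log entries. A minimal local sketch, with the entry fields invented for illustration (in a Log Analytics workspace you would express the same idea as a KQL query over the AzureActivity table):

```python
WRITE_OPS = {"PUT", "POST", "DELETE"}

def audit_writes(entries):
    """Reduce raw activity entries to the what/who/when of write operations."""
    return [
        {"what": e["operation"], "who": e["caller"], "when": e["timestamp"]}
        for e in entries
        if e["method"] in WRITE_OPS
    ]

entries = [
    {"operation": "Create VM", "caller": "ana@contoso.com",
     "timestamp": "2021-01-10T09:12:00Z", "method": "PUT"},
    {"operation": "Read Key Vault secret", "caller": "app-sp",
     "timestamp": "2021-01-10T09:13:00Z", "method": "GET"},
    {"operation": "Delete disk", "caller": "ops@contoso.com",
     "timestamp": "2021-01-10T10:01:00Z", "method": "DELETE"},
]
writes = audit_writes(entries)
```

Reads drop out, and what remains is exactly the audit trail of changes: who wrote what, and when.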

Azure Active Directory logs contain the history of sign-in activity and an audit trail of changes made in Azure Active Directory for a particular tenant.

Resource Logs provide insight into operations that were performed within an Azure resource, for example getting a secret from a Key Vault or making a request to a database. The content of resource logs varies by the Azure service and resource type.

Send the Activity Log to a Log Analytics workspace to enable its features, which include the following:

1. Correlate Activity log data with other monitoring data collected by Azure Monitor.

2. Consolidate log entries from multiple Azure subscriptions and tenants into one location for analysis together.

3. Use log queries to perform complex analysis and gain deep insights on Activity Log entries.

4. Use log alerts with Activity Log entries, allowing for more complex alerting logic.

5. Store Activity log entries for longer than 90 days.

Also, great news: there is no data ingestion or data retention charge for Activity Log data stored in a Log Analytics workspace.


In the next post, we'll explain how to monitor virtual machines and, more importantly, applications and web services.

See you then…

What happens when cloud adoption is more than that? Do you have a cloud strategy based on your IT profile?

We all think that there are three possible stages in your journey to the cloud:

There are those companies, digital starters, looking for advice to start moving some workloads to the public cloud from their on-premises or private cloud infrastructure. There are those companies, called digital expanders, with some experience already on the public cloud and satisfied with the outcome of a first cloud adoption on those projects. Finally, there are the digital leaders, cloud-native or not, with important OPEX investment and a strong focus on the business motivations and outcomes of being cloud-first in almost everything they do.

But what happens if we change our perception? What if we think there is a conservative IT profile, a moderate IT profile or even an aggressive one in terms of how they leverage cloud-native technologies?

On one hand, another factor to evaluate is not just how to prepare the cloud adoption with methodologies like CAF, but also to understand that not all companies need a data analytics platform or an IoT solution, at least during the coming years.

On the other hand, how can you map those cloud flavours and the cloud-native technologies onto the real world? Well, we can start with this full picture that came to my mind some time ago.

Depending on your IT profile, you will be working with some of these cloud flavours.

This picture (based on the Microsoft Azure approach) tries to represent that there are several technologies that we can group by cloud strategy and associate with an IT profile.

My cloud vision based on technologies and cloud strategies

A conservative IT profile – This would be a company mostly based on traditional infrastructure, with storage, backup or archiving as its most important priorities, as well as some VMs or LOB applications. Sectors like banks and finance institutions are well represented here. Actors like hardware providers still support them on their on-prem and private cloud platforms. They also have a big investment in leading hypervisors and complex computing technologies, and use some scalability with containers and autoscale sets, but limited to their own resources. User experience and usability in their apps is not a strong point, and automation of processes with RPA, or the use of modern DevOps platforms, is also not very widespread in these companies. They have lots of legacy applications and monolithic databases, old data warehouses and traditional ERPs.

Conservative IT Profile

A moderate IT profile – This would be a company more focused on providing an app or an e-commerce platform with almost no downtime and scaling based on seasonal products. Maybe they are even migrating some specific workloads to bring innovation, to work in a global way with other subsidiaries, or to leverage the potential of some disruptive solution like bots to improve the user experience for their customers. Hardware almost disappears in this kind of company. They have a hybrid model and are starting to embrace the disruption of new cloud-native solutions like cognitive services, machine learning or data analytics. They are even integrating SaaS technologies like DocuSign and using the marketplace to replace some third-party products that were previously present in the on-premises data center they had. An example of this profile could be retail companies offering a new online shopping experience.

moderate IT Profile

An aggressive IT profile – Be cloud-first. All they want is to work on the public cloud whenever possible, as they have learned a lot about the benefits and outcomes when the CAF and the progressive migration of workloads are well architected and well defined. They have tremendous knowledge of leveraging disruptive technologies, saving costs, providing the right governance and security, and achieving their goals. These companies are very dynamic, use agile methodologies, and have clear priorities on accelerating daily processes and the business and improving employee and customer experiences. Innovation is their mantra. Here you will see startups in fintech, healthtech, etc. You will see the enterprise vertical in renewable energy companies, insurance, or the chemical and pharmaceutical industry. They use data analytics and Big Data massively, along with ML, PaaS, serverless and modern DevOps platforms. They reduce investment in hardware and licenses and integrate SaaS, blockchain and other technologies into the daily user experience. They also provide apps and remote work to their users.

Aggressive IT profile

To summarize, this post just tries to show that out there, adopting the cloud, each company and each public institution wears its own hat and can tailor the technologies to its needs. Finally, not all companies in each vertical or business fit these descriptions, but without stereotyping, it is a way of defining types of companies that will sooner or later make use of the benefits of the public cloud.

See you then in the next post…

Your code and your builds from anywhere, anytime – Azure Repos and Azure Pipelines (1)

It's not magic, but a very versatile tool that can provide all you need to work remotely on your continuous integration. If you want a repository with security and SSO, integrated with the best tools to work on your builds, and if you want a solution to automate the builds as well as the releases, you are in the right post.

First of all, and somehow starting from the end (yes, the end), you can choose Git, GitHub, Bitbucket (Atlassian), GitLab, etc. as the origin of the code for your builds. Yes, it is up to you where you keep your code. I wanted to point this out to show you how flexible the solution is.

So after this clarification, let me tell you that Azure Repos provides a Git repository by default. You will use a Gitflow approach, where you have the master branch on Azure Repos and several branches for developers to solve issues, develop new features or fix bugs in a distributed way.

So after a pull request, some specific branch policies that you may or may not put in place, and the approval of the stakeholders involved in application development, you will merge your code into the master branch in the cloud, in your Azure Repos for this specific project.

In Repos you can maintain the code, JSON files and YAML files, and clone, download or make changes from your Eclipse or Visual Studio client, for example.

Furthermore, you have the tracking of commits and pushes done in your project code, as well as an overview of all the branches currently active.

With a proper build strategy in place, you can prepare a new build very easily: just hit "New Pipeline".

Choose, as I told you at the start of this post, where your code is.

Let's say we are going to use a YAML file. A YAML file is nothing more than a configuration file in a data-oriented language where you describe everything related to the application you want to compile: the runtime, the programming language and the package manager for including some specific libraries, for example.
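A hedged, minimal example of what such an azure-pipelines.yml could look like (the branch name, agent image, project layout and test command are assumptions for illustration):

```yaml
# Minimal build pipeline: runs on every push to master.
trigger:
  - master

pool:
  vmImage: 'ubuntu-latest'   # Microsoft-hosted agent

steps:
  - script: |
      pip install -r requirements.txt   # assumed package manifest
      python -m pytest                  # assumed test runner
    displayName: 'Install dependencies and run tests'
```

The same file lives alongside the code in the repo, so the build definition is versioned and reviewed just like any other change.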

And finally, save and run your build. In this case it will be rolled out manually, but you can configure automation and trigger a code compilation after some changes in the code, perhaps with some approvals from your project manager.

So if you then want to configure a trigger for the automation, just set up the Triggers tab depending on your needs, and that's all.

In the next post I'll continue explaining more about Azure Repos and Azure Pipelines, so you can see the tremendous mechanism with which top-performing development companies accelerate continuous deployment.

Enjoy the journey to the cloud with me…see you then in the next post.

Work as a team in Covid times – Azure Boards

Developers, project managers and stakeholders are the pillars of creating a truly powerful app during an application modernization project. This was already a challenge before Covid, and now even more so. Are your developers disconnected? Are your teams more silos than teams?

There is a quite important component within Azure DevOps called Boards. This component is part of your solution, as you can roll out Scrum or agile approaches quite easily to your developers and stakeholders.

So within your organization on Azure DevOps, click on "New project". You can choose between a private project (which requires authentication and is focused on a team) or a public one (for open source development, for the Linux community for example).

So when I start a project, can I choose the methodology and strategy to follow up my project and foster collaboration?

Azure Boards supports several work item types: user stories, tasks, bugs, features, and epics. When you create a project, you have the option to choose whether you want a Basic project with just Epics (the goals that you would like to achieve, let's say), Issues (the steps or milestones to be followed) and Tasks (included per issue; the list of points you need to execute to get the issue done), or Agile, etc. You can then assign all these items to several people and correlate those efforts across several sprints.
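Work items can also be created programmatically: the Azure DevOps REST API accepts a JSON Patch document for this. Below is a hedged Python sketch that only builds the request body; the organization, project, assignee and token handling are assumptions, and the HTTP call itself is left as a comment:

```python
import json

def new_task_patch(title, assigned_to=None):
    """Build the JSON Patch body for creating a Task work item."""
    ops = [{"op": "add", "path": "/fields/System.Title", "value": title}]
    if assigned_to:
        ops.append({"op": "add", "path": "/fields/System.AssignedTo",
                    "value": assigned_to})
    return json.dumps(ops)

body = new_task_patch("Configure release pipeline", "dev@contoso.com")
# The body would then be POSTed (authenticated with a PAT) to something like:
# https://dev.azure.com/{org}/{project}/_apis/wit/workitems/$Task?api-version=6.0
# with Content-Type: application/json-patch+json
```

This is handy when you want sprints and boards to be fed automatically, for example creating a follow-up task whenever a build fails.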

So you can create work items to track what happens in the follow-up of your development. You can coordinate and motivate your team to solve delays and problems, and you will be home and dry.

On one hand, you have all the developers working remotely in these tough times and using SSO, as Azure DevOps is integrated with your Azure Active Directory.

On the other hand, you can invite other employees or users as guests to access your projects if they are private. Keep in mind that you can even create public projects for open source software, as I mentioned previously.

Let's see how a project manager can track an application using a Scrum approach. In the Epic you establish the goals: some business requirements or maybe a specific enhancement to the application. To achieve that, the team will work on a Feature with a bunch of Tasks to be done. All this effort will also be tracked on boards.

So in this case of an Agile project you can use an approach like this one, where you have a goal or Epic, "New Web Service Version", some user stories for roles like the project manager or the developers, and obviously some issues which involve tasks to be done.

For example, create a new CI/CD process with a pipeline where you deploy the releases to a web app (with slots for staging, production or development).

You can also see this process, including the issues or tasks associated with each sprint. You will have as many sprints as needed to achieve all the goals of the final application release. To see them, just check Sprints within Azure Boards. Take into account that you need to determine which steps, issues or tasks should be done in each of those sprints.

Finally, pay attention to the timeline of your sprints so the project manager can detect delays and help the team progress properly.

Adding Azure Boards as a tab within Teams fosters collaboration between stakeholders: it makes access, project follow-up and the checking of every milestone very flexible.

On Teams you can add a new tab and choose Azure DevOps…

When selecting the app, you can choose the organization, project and team.

Once you have selected everything, you can hit Save.

Now as a project manager, you can stay on the same page with a few clicks.

In the next post we’ll show and explain more about Azure Repos and Azure Pipelines, so you can see the tremendous mechanisms that top-performing development companies use to accelerate Continuous Integration.

Enjoy the journey to the cloud with me…see you then in the next post.

Azure DevOps integrates all in one

The current landscape is full of companies with a bit of a mess in their software life cycle. I mean, different repositories, several version control approaches, open source components not clearly identified in some cases, multiple programming languages and package managers, and even a CI/CD strategy that can change a lot from one application to the next.

The result is delayed releases, miscommunication between development teams, unclear bug fixes, painful builds, sprints extended longer than expected, poor software quality with scarce testing and, in general, risks to code security.

Why Azure Devops?

Azure DevOps provides everything needed: repository integration, tools to review code quality, integration with big names from the open source world like Jenkins, and friendly support for package managers such as NuGet, npm and Maven. Even operational maintenance can be done with Microsoft solutions such as Azure Monitor (Application Insights is now a component of it), Azure Security Center, Azure Policy, etc., or, if you prefer, with Nagios, Splunk or Zabbix.
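The package-manager support shows up as ready-made pipeline tasks. A hypothetical sketch of the steps section of an `azure-pipelines.yml` mixing the three ecosystems mentioned above (solution and POM paths are placeholders):

```yaml
# Sketch: Azure Pipelines built-in tasks for NuGet, npm and Maven.
steps:
  - task: NuGetCommand@2          # .NET packages
    inputs:
      command: restore
      restoreSolution: '**/*.sln'

  - task: Npm@1                   # Node.js packages
    inputs:
      command: install

  - task: Maven@3                 # Java packages
    inputs:
      mavenPomFile: pom.xml
      goals: package
```

In practice a single pipeline rarely needs all three, but each task works the same way: pick the task, point it at the project file, and the hosted agent already has the toolchain installed.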

If your team works mainly with Eclipse, Jenkins, Selenium, SonarQube, or even Jira, you have full and flexible integration with Azure DevOps.

If your team works with Visual Studio and you have MSDN subscriptions, you can get Azure DevOps user licenses for free. Sounds good, doesn’t it?

But, in a nutshell, what benefits can Azure DevOps bring to our company?

  • Timely Access to New Features

Every three weeks, DevOps users receive access to new features.

  • Remote cloud accessible anywhere, anytime, SSO

Users can access it anywhere, anytime, without a VPN, and with SSO and MFA: security combined with the mobility and flexibility to work remotely.

  • No Upgrades to Worry About

Users need not worry about upgrading or patching the toolchain because Azure DevOps is a SaaS product. Companies that run on a CI/CD model no longer need to slow things down for the sake of upgrading.

  • Reliability

Azure DevOps is backed by 24 x7 support and a 99.9% SLA.

  • Flexibility

If your DevOps team doesn’t want or need the full suite of services, they can acquire only the specific components required to fulfill your expectations. It is even possible to integrate open source solutions, as we have mentioned, and competitor products as well.

  • It’s Platform-agnostic

Azure DevOps is designed to build for any platform (Linux, macOS, and Windows) and any language (Android, C/C++, Node.js, Python, Java, PHP, Ruby, .NET, and iOS apps). Wow!

  • It’s Cloud-agnostic

Azure DevOps can deploy to AWS and GCP, not just Azure.

In the next post we will show and explain the components of this solid and robust developer platform.

Enjoy the journey to the cloud with me…see you then in the next post.

Any Company is a Software Company: Why DevOps Matters

Any company needs software to support its processes, its daily work and the systems which interact with its customers and partners.

Many companies don’t know how to close the gap in their software needs: they have lots of repositories, more than one version control platform, eternal development cycles spanning multiple programming languages, assorted build and test tools and, to increase the risk, traditional waterfall phases to reach a final software product release.

In addition, in many scenarios there are silos developing separate software solutions for different areas of the same business, overlapping efforts, not collaborating properly with other teams and not empowering developers and stakeholders with a flexible, anywhere-anytime distributed cloud DevOps approach.

The most important market leaders focused on DevOps that also provide a CI/CD approach are Atlassian Jira and Microsoft Azure DevOps.

There are several DevOps products on the market that help apply Agile or Lean methodologies to software development. Some of them focus just on collaboration and teamwork, facilitating user stories, backlogs and work items for Scrum Masters or key developers. Others focus on providing version control, integrating Git-flow strategies or improving testing.

But, to be honest, the only ones ready to provide an efficient CI/CD strategy with their own tools, or by integrating solutions such as Jenkins, CircleCI or Octopus Deploy, are these two market leaders, from my point of view.

Azure DevOps is an all-terrain platform. It can support Scrum or CMMI with its Azure Boards component, build packages with NuGet, npm or Maven on the CI side with Azure Pipelines and, at the same time, deliver the release to web services or containers on the CD side if needed. It can provide test plans or use open source tools such as Selenium or SonarQube to reinforce the code in terms of quality and security. And, as everybody knows, Microsoft’s bet for source control is GitHub and Git on Azure Repos.

Atlassian Jira can handle Agile and can be integrated with Azure Pipelines as well as Atlassian’s own CI/CD tool, Bamboo. For Git repository management you can use Bitbucket, and you can even use Jira Service Desk as the ITSM solution. It is a veteran in the market and a very solid approach.

So when does it make sense to use one or the other? It all depends on what you want to develop, which requirements should be taken into account, deadlines, dependencies, and whether your starting point is a legacy monolithic mono-repository or a cloud-first DevOps strategy.

Do you need a flexible cloud DevOps platform with powerful features for remote work, without losing security, that improves collaboration with partners and customers? Do you want SSO? Then the answer is Azure DevOps.

In the next post we will look at the challenges we have to cope with in software development, quality, risks, and best-effort strategies, leveraging Azure DevOps to fix them all in one place.

Enjoy the journey to the cloud with me…see you then in the next post.