
What is Environment as a Service and How is it Impacting DevOps

Posted by Jeff Rezabek August 5, 2019

Applications are driving business agility and revenue streams. DevOps-focused organizations taking part in this digital transformation are tasked with providing their teams with application environments to support rapid development, testing, and deployment. Unfortunately, setting up and sharing the application environments that your teams need is time-consuming, difficult to troubleshoot, hard to scale, and full of hidden costs. This has created the need for Environment as a Service solutions that can scale DevOps processes.

Environment as a Service (EaaS) solutions help users define applications together with their infrastructure and data requirements and make them accessible and mobile, so they can be consumed seamlessly by any process. The aim of EaaS is to eliminate the application environment bottleneck and enable faster innovation at scale.

3 Ways to Scale DevOps with Environment as a Service

1. Gain Cloud Choice

In today's fast-moving market, businesses thrive on their applications' agility. However, the consequences of cloud lock-in and the fear of losing control over the application's data and infrastructure limit a business's ability to fully benefit from the cloud, which can impact profits, productivity, and time to market.

Using Environment as a Service solutions, you gain the ability to create one standardized blueprint that can call on any of your available cloud resources, whether that's AWS, Azure, or Kubernetes, achieving a multi-cloud approach to your DevOps initiative.
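To make the idea concrete, here is a minimal sketch of what "define once, deploy anywhere" could look like. All class, function, and provider names are illustrative assumptions, not the API of any particular EaaS product:

```python
# Hypothetical sketch: one environment blueprint, many cloud targets.
# None of these names come from a specific EaaS product; they only
# illustrate the "define once, deploy anywhere" idea.
from dataclasses import dataclass, field

@dataclass
class Blueprint:
    name: str
    apps: list = field(default_factory=list)       # app images/services to deploy
    networks: list = field(default_factory=list)   # subnets, security rules

PROVIDERS = {"aws", "azure", "kubernetes"}

def deploy(blueprint: Blueprint, provider: str) -> str:
    """Resolve the same blueprint against whichever cloud is available."""
    if provider not in PROVIDERS:
        raise ValueError(f"unsupported provider: {provider}")
    # A real EaaS engine would translate apps/networks into provider-native
    # resources (EC2 / VNets / Pods); here we just report the decision.
    return f"{blueprint.name} deployed to {provider} with {len(blueprint.apps)} apps"

env = Blueprint(name="web-stack", apps=["frontend", "api", "db"])
print(deploy(env, "azure"))
```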

2. Establish Environment Automation

With the rise of agile development and DevOps principles came the new challenge of supplying your development, testing, and production teams with the complex environments they need to do their jobs and advance the application. However, the security and complexity built into your production environment aren't always carried down to the testing and development environments, which have a shorter shelf life. The larger the deviation from the production environment, the greater the chance of exposing vulnerabilities in production.

As the pressure to innovate faster increases, your teams will need more frequent access to production-like environments to develop, test, and deploy your applications quickly.

By creating a standard, automated environment using an Environment as a Service tool, you can give your distributed teams the self-service ability to spin up and decommission environments on demand, which can accelerate time to market.
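A minimal sketch of that self-service lifecycle follows; the names are purely illustrative, and no real EaaS API is implied:

```python
# Illustrative self-service lifecycle: any team member spins an environment
# up and decommissions it on demand, with no ticket queue in between.
import uuid

active = {}  # stands in for the platform's registry of live environments

def spin_up(blueprint, owner):
    env_id = str(uuid.uuid4())[:8]
    active[env_id] = {"blueprint": blueprint, "owner": owner}
    print(f"{owner} spun up {blueprint} as {env_id}")
    return env_id

def decommission(env_id):
    env = active.pop(env_id)
    print(f"decommissioned {env_id} ({env['blueprint']})")

env = spin_up("prod-like-stack", owner="dev-team-emea")
decommission(env)
```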

3. Manage Environment Consumption

Cloud providers make it easy to add more cloud resources to help you scale when needed. However, a lack of visibility into resource utilization, cloud spend, and more makes it difficult to manage your environments efficiently. Through Environment as a Service features like auto-tagging and role-based access control, you can gain the critical insight you need to manage environment consumption, reduce cloud sprawl, and accurately plan for future resource needs.
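As an illustration of how auto-tagging enables that insight, here is a toy sketch that tags each environment at creation and then aggregates spend by team; the record format and the numbers are made up for the example:

```python
# Minimal sketch of "auto-tagging" for consumption tracking (illustrative only).
# Every environment gets owner/team/blueprint tags at creation time, so spend
# can later be grouped without relying on engineers to tag by hand.
from collections import defaultdict

def auto_tag(env_name, owner, team, blueprint):
    return {"env": env_name, "owner": owner, "team": team, "blueprint": blueprint}

def spend_by_team(usage_records):
    """usage_records: list of (tags, cost_in_dollars) tuples."""
    totals = defaultdict(float)
    for tags, cost in usage_records:
        totals[tags["team"]] += cost
    return dict(totals)

records = [
    (auto_tag("demo-1", "alice", "presales", "web-stack"), 12.40),
    (auto_tag("test-7", "bob", "qa", "web-stack"), 3.10),
    (auto_tag("demo-2", "carol", "presales", "db-bench"), 8.25),
]
print(spend_by_team(records))  # {'presales': 20.65, 'qa': 3.1}
```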

Want to scale your DevOps initiative 3x? Download our “Buyer’s Guide To Scaling DevOps” to learn more about what it takes to scale DevOps using an EaaS tool.


Running Pre-Sales Demos on the Public Cloud: Ingredients of a Winning PoC

Posted by Pascal Joly March 22, 2019

"Anything that can go wrong will go wrong." Sales Engineers engaging in high stakes demos are familiar with this saying. One of the most stressful moments they experience during customer PoCs happens actually before the demo. Is everything ready? Until recently, preparing the infrastructure for a technical product demo involved reserving and shipping some hardware, connecting these servers or appliances to the network and configuring everything end to end. We're talking weeks or possibly months of planning ahead of the actual "D" day with delays pretty much a given, lost shipment and unresolved IT tickets highly probable.

Then came virtualization and public cloud infrastructure.

Demos on the Public Cloud: a turning point with a few caveats

Public cloud was a turning point for many sales engineers in the tech world. With unlimited on-demand capacity, Infrastructure as a Service is as simple as online shopping: there is no need to be technically advanced to deploy a virtual machine for a demo in Azure, AWS, or Google Cloud, to name a few. Sales engineers were finally blessed with all the ingredients for a winning PoC.

Not so fast. Complaints from the various stakeholders involved in the process were quick to emerge:

  • IT admin: "I want to know what's happening in the infrastructure, cloud or not, so I can't relinquish control to some random SE's credit card account. I should be the one in charge!"
  • CIO, CFO: "I want to understand why my infra spending has been skyrocketing since these sales engineers started using the public cloud."
  • Sales engineer: "I can't figure out how to build these demos when I have more than one VM involved. It takes too much time, and I always need help."

Was the line "just run your demos on the public cloud" too good to be true? It seems we were missing a few ingredients after all.

Overcoming the Hurdles with Self-Service Environments


Have you ever dreamt of a self-service platform that lets pre-sales engineers dynamically deploy their demo environments on the public cloud, no matter how complex they are, and clean them up after the demo is complete? If only...

It turns out Quali's CloudShell has been designed to natively offer these features on the private or public cloud of your choice. It recently came into the limelight with a case study published by our partner Microsoft on a joint customer win, Skybox Security. Nothing illustrates the point better than a real customer story.

CloudShell provided Skybox with a few key ingredients that enable their sales team to demo their solution effectively on the Azure public cloud:

  • Advanced blueprint modeling with standard building blocks using native Azure Marketplace images or cloud services.
  • Automated deployment of VMs with out-of-the-box network and security configuration.
  • Self-service approach with demos organized in categories for easy access by multiple teams.
  • Simple, visual interaction with the demo environment to access the resources and showcase the value of the solution.
  • Time-bound environments: once the demo is complete, the infrastructure is automatically cleaned up, so there are no more ghost VMs, subnets, or leftover storage sucking up IT budgets (see the sketch after this list).
  • Enterprise-ready and multi-tenant to scale up as your team grows.
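Here is a minimal sketch of the time-bound idea flagged in the list above: every environment carries an expiry timestamp, and a periodic sweep tears down anything past it. The structures and names are illustrative assumptions, not CloudShell's actual mechanism:

```python
# Time-bound environments, sketched: each environment carries an expiry,
# and a periodic sweep removes anything past it, so nothing outlives the demo.
import time

environments = [
    {"name": "demo-azure-01", "expires_at": time.time() - 60},    # already expired
    {"name": "demo-azure-02", "expires_at": time.time() + 3600},  # one hour left
]

def sweep(envs, now=None):
    now = now if now is not None else time.time()
    survivors = []
    for env in envs:
        if env["expires_at"] <= now:
            print(f"tearing down {env['name']}: VMs, subnets, storage")  # no ghost VMs
        else:
            survivors.append(env)
    return survivors

environments = sweep(environments)
```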


Now that we're back on the right track, why stop here? Environment as a Service has been used in many similar use cases: training platforms for internal employees, cyber ranges, marketing demos, and even support teams reproducing bugs.

Before you catch your breath, make sure you download our solution brief on this topic or (no pun intended) schedule a demo!

5G End to End Infrastructures Deployed Secure and Fast

Posted by german lopez February 21, 2019

The promise of tomorrow is here today. Digital transformation is taking shape at a rapid pace due to advancements in all facets of technology. One key facet is 5G and the use cases it enables. End-user expectations are high, given the new use cases related to gaming, high-bandwidth video, and IoT-derived interactions. However, 5G End-to-End (E2E) solutions are complex and include multiple technologies, as illustrated below.


The networks, applications, devices, services, workflows, and workloads require a level of interoperability that was not required in years past. Network operators are tasked with automating network slices that deliver guaranteed services to endpoints and applications that are continuously evolving. Add cybersecurity and privacy regulations to the equation, and one can understand why automated test environments are required to test functionality, security, and performance.

Secure and Fast Solution

To address this challenge, Quali has partnered with Accedian and Cavirin to showcase a 5G Environment as a Service (EaaS) 'Secure and Fast' solution. Quali's CloudShell Pro, along with Cavirin's CyberPosture Intelligence and Accedian's SkyLIGHT PVX, provides a security and performance score for the 5G E2E infrastructure. This score gives network operators an understanding of how well they are able to deliver Quality of Experience and Quality of Service to their customers.

CloudShell Pro gives network operators the capability to model, orchestrate, and deploy the 5G E2E infrastructure with self-service, on-demand blueprints. The following environment is deployed in a public cloud, Microsoft Azure. It incorporates a MicroFocus MobileCenter application, which reserves and tests smartphone devices within a local lab environment, essentially enabling a hybrid cloud model.
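To illustrate the hybrid model just described, here is a small sketch of a blueprint that pairs cloud-hosted services with lab-hosted physical devices so they are reserved together as one environment. The structures are hypothetical, not CloudShell Pro's actual data model:

```python
# Illustrative sketch of a hybrid blueprint: cloud-hosted pieces next to
# lab-hosted physical devices, reserved together as one environment.
blueprint = {
    "name": "5g-e2e-secure-and-fast",
    "cloud": {
        "provider": "azure",
        "services": ["core-network-vms", "mobile-test-app"],
    },
    "lab": {
        "location": "local-device-lab",
        "devices": ["smartphone-pool"],  # reserved alongside the cloud resources
    },
}

def reserve(bp):
    cloud = ", ".join(bp["cloud"]["services"])
    lab = ", ".join(bp["lab"]["devices"])
    return f"reserved {bp['name']}: azure[{cloud}] + lab[{lab}]"

print(reserve(blueprint))
```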


Cybersecurity & Compliance Posture

The Secure and Fast service is activated with both Cavirin and Accedian scanning, monitoring, and scoring the solution for security and performance metrics. A variety of cybersecurity and compliance service packs are available for inclusion with the test, covering regulations and frameworks such as PCI, HIPAA, GDPR, DISA, and NIST. The following example illustrates a CyberPosture score for analysis and remediation.


Infrastructure Visibility

Data, application, and network traffic performance is scored by SkyLIGHT PVX. An aggregate score, the End User Response Time (EURT), is determined by combining three data points:

  • Network: Round Trip Time (RTT)
  • Application: Server Response Time (SRT)
  • Data: Data Transfer Time (DTT)

The following score highlights all three data points as well as the EURT. The EURT gives network operators a granular view of how their infrastructure is performing.
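The post doesn't spell out how the three data points combine, so as a reading aid here is the simplest additive interpretation; treat the formula and the numbers as an assumption for illustration, not Accedian's published definition:

```python
# Assumed additive model for End User Response Time (EURT); the post combines
# three data points but does not give the formula, so treat this as a sketch.
def eurt_ms(rtt_ms: float, srt_ms: float, dtt_ms: float) -> float:
    """Network Round Trip Time + Server Response Time + Data Transfer Time."""
    return rtt_ms + srt_ms + dtt_ms

print(eurt_ms(rtt_ms=40.0, srt_ms=120.0, dtt_ms=65.0))  # 225.0 ms end to end
```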


The overall benefit to both enterprises and service providers is substantial, given this granular view of 5G E2E infrastructure security and performance:

  • Simplify: 5G network, application, and cloud infrastructure services
  • Automate: Expedite deployments and custom configurations
  • Secure: Validate cybersecurity and compliance postures
  • Efficient: Gain visibility into resource utilization and cost savings

Data analytics tools and the utilization of artificial intelligence provide additional insights into an organization's ability to introduce 5G-related services. Together, the combination of Quali, Cavirin, and Accedian as a Secure and Fast service accelerates an organization's ability to introduce 5G digital transformation initiatives.

Secure & Fast will be demonstrated at Mobile World Congress Feb 25-29 in Barcelona and during RSA March 4-8 in San Francisco. To book a meeting or to express interest in trialing this new solution, please visit accedian.com/secure-fast.  To schedule a demo of EaaS labs with CloudShell Pro please visit  Quali.com


A Netflix-like Approach to DevOps Environment Delivery

Posted by german lopez November 18, 2018

Netflix has long been considered a leader in providing content that can be delivered both in physical format and streamed online as a service. One of their key differentiators in providing the online streaming service is their dynamic infrastructure, which allows them to maintain streaming services regardless of software updates, maintenance activities, or unexpected system challenges.

Creating that dynamic infrastructure required them to develop a pluggable architecture. The architecture had to support innovations from the developer community and scale to reach new markets. Multi-region environments had to be supported with sophisticated deployment strategies that allowed for blue/green deployments, release canaries, and CI/CD workflows. The DevOps tools and model that emerged also extended to, and impacted, their core business of content creation, validation, and deployment.

I've described this extended impact as a "Netflix-like" approach that can be applied to enterprise application deployments. This blog illustrates how DevOps teams can model enterprise application environment workflows using an approach similar to the one Netflix uses for content deployment and consumption: point, select, and view.

Step 1:  Workflow Overview

The initial step is to understand the workflow process and dependencies.  In the example below, we have a Netflix workflow whereby a producer develops the scene with a cast of characters.  The scene is incorporated into a completed movie asset within a studio.  Finally, the movie is consumed by the customer on a selected interface.

Similarly, the enterprise workflow consists of a developer creating software with a set of tools. The developed software is packaged with other dependent code and made available for publication. Customers consume the software package on the interface of their choice.

Step 2:  Workflow Components

The next step is to align the workflow components.  For the Netflix workflow components, new content is created and tested in environments to validate playback.  Upon successful playback, the content is ready to deploy into the production environment for release.

The enterprise components follow a similar workflow. The new code is developed, and additional tools are utilized to validate functionality within controlled test environments. Once functionality is validated, the code can be deployed as a new application or as part of an existing one.

The Netflix workflow toolset referenced in this example is their Spinnaker platform. The enterprise counterpart is Quali's CloudShell Management platform.

Step 3:  Workflow Roles

Each management tool requires setting up privileges, privacy, and separation of duties for the respective personas and access profiles. Netflix makes it straightforward to manage profiles and personalize permissions based on the type of experience the customer wishes to engage in. Similarly, within Quali CloudShell, roles and associated permissions can be established to support DevOps, SecOps, and other operational teams.

Step 4:  Workflow Assets

The details of the media asset, whether categorization, description, or other specifics, help define when, where, and by whom the content can be viewed. The ease of the point-and-click interface makes it simple for a customer to decide what type of experience they want to enjoy.

On the enterprise front, categorization of packaged software as applications can assist in determining the desired functionality. Resource descriptions, metadata, and resource properties further identify how and which environments can support the selected application.

Step 5:  Workflow Environments

Organizations may employ multiple deployment models whereby pre-release tests are required in separate test environments. Each environment can be streamlined for accessibility, setup, teardown, and extension to ecosystem integration partners. The DevOps CI/CD pipeline approach to software release can become an automated part of the delivery value chain. The "Netflix-like" approach to self-service selection, service start/stop, and other automated features helps make cloud application adoption seamless. All that's required is an orchestration, modeling, and deployment capability to handle the multiple tools, cloud providers, and complex workflows.
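As a concrete reading of this step, here is a minimal sketch of a pipeline run that wraps tests in a create/teardown environment lifecycle; all function names are illustrative stubs, not a specific CI or CloudShell API:

```python
# Sketch of the CI/CD idea in Step 5: wrap every pipeline run in a
# create / test / teardown environment lifecycle so test environments
# never outlive the run. All names here are illustrative stubs.
def create_environment(blueprint):
    print(f"spinning up environment from blueprint '{blueprint}'")
    return {"blueprint": blueprint, "status": "up"}

def run_tests(env):
    print(f"running pre-release tests in {env['blueprint']}")
    return True

def teardown(env):
    env["status"] = "down"
    print(f"tearing down {env['blueprint']}")

def run_pipeline(blueprint):
    env = create_environment(blueprint)   # self-service, on demand
    try:
        return run_tests(env)             # isolated pre-release validation
    finally:
        teardown(env)                     # runs even if the tests fail

run_pipeline("payments-service-staging")
```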

 

Step 6:  Workflow Management

Bringing it all together requires workflow management to ensure that edge functionality is seamlessly incorporated into the centralized architecture. Critical to Netflix's success is their ability to manage clusters and deployment scenarios. Similarly, Quali enables Environments as a Service to support blueprint models and ecosystem integrations. These integrations often take the form of pluggable community Shells.

Summary

Enterprises that are creating, validating, and deploying new cloud applications are looking to simplify and automate their deployment workflows. The "Netflix-like" approach to discover, sort, select, view, and consume content can be modeled and utilized by DevOps teams to deliver software in a CI/CD model. The Quali CloudShell example of Environments as a Service follows that model, making it easy to set up workflows that accomplish application deployments. For more information, please visit Quali.


Quali recognizes Champion at Liberty Global for his Initiative and Innovation

Posted by Pascal Joly October 10, 2018

Welcome to the second cohort of the Quali CloudShell Champion program! This program acknowledges some of our most prominent advocates and contributors who are actively promoting automation within their companies.

Rocket Scientist category: someone recognized for their technical wizardry who has contributed helpful Shells or plugins to the Quali repository.

Winner: Maarten Juffermans, Liberty Global

Maarten is the lead test automation engineer of the International Test Center Engineering group (ITC) at Liberty Global. His team is in charge of providing the infrastructure that enables the technology engineering teams to deliver their products (CPE, network, VoIP, DTV, automation, etc.). They are based in the Netherlands.

One of Maarten's primary contributions was developing all the Shells related to the digital media devices used by the business units. That included support for a large variety of Set-Top Boxes (STBs) from different vendors to automate firmware testing (60 devices, eventually moving up to 600). Maarten has also used an innovative approach to enable Ansible automation playbooks for physical components in the CloudShell blueprint. Finally, he built a framework to sync inventory from their DCIM tool into CloudShell.
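For a rough picture of what such an inventory sync involves, here is a toy sketch that upserts devices from a DCIM export into an in-memory inventory; the names and structures are hypothetical, not Liberty Global's or CloudShell's actual code:

```python
# Hypothetical sketch of a DCIM-to-platform inventory sync: read devices from
# a DCIM export and create or update matching entries in the inventory.
dcim_export = [
    {"name": "stb-vendor-a-001", "model": "STB-A", "address": "10.0.1.21"},
    {"name": "stb-vendor-b-002", "model": "STB-B", "address": "10.0.1.22"},
]

inventory = {}  # stands in for the platform's resource store

def sync(devices):
    for dev in devices:
        if inventory.get(dev["name"]) != dev:
            inventory[dev["name"]] = dict(dev)  # create or update
            print(f"synced {dev['name']} ({dev['model']})")

sync(dcim_export)
```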

When not busy developing Shells, Maarten plays lead guitar in a local rock band called "Rokkbottom".

Have a champion in mind? Registration is open for our next round of awards. You can nominate yourself or someone else.


The Future of Automation

Posted by Tejas Mattur September 26, 2018

Guest blog contribution from Quali’s summer intern – Tejas Mattur, of Mission High School

Do the benefits of AI and automation outweigh the costs? This question is a loaded, complex one, with no simple answer. Like any other complex matter, there are valid arguments on both sides. At the crux of it, there are really only two main arguments regarding this topic. In this article, I will briefly outline the two stances and the evidence used to back them up. Lastly, I will provide my own view on the issue at hand.

First, the benefits. The pros of AI and automation technology are quite self-explanatory. For centuries, humans have developed technology to contribute to the advancement of society, and AI and automation are no different. From an economic standpoint, technological advancement and economic growth have always been positively correlated. In fact, American economist Robert Solow famously estimated that technological change accounted for about two-thirds of the growth of the U.S. economy, after allowing for growth in the labor force and capital stock. Even more interestingly, in 1930 the famous economist John Maynard Keynes predicted that his grandchildren would only have to work 15 hours a week, thanks to advances in innovation, machines, and technology. Obviously, Keynes was more than a bit off with his estimate, but his logic was right in that technology has made life easier for humans as a whole. Technology has obvious benefits: it allows businesses to increase efficiency and cut costs, which in turn helps stimulate the economy. More specific information on the benefits of AI and IT automation can be found in my previous article, but simply put, these tools help businesses function at more productive rates than in the past.

Next, the drawbacks. This is where the majority of research about automation and AI over the last decade has been focused. The rise of AI and automation has led to a growing fear across the world that human skills will soon become outdated and that jobs will be easily replaced by robots. A 2017 study by the McKinsey Global Institute found that by 2030, as many as 73 million jobs in the U.S. could be destroyed. The study also projects significant effects worldwide, stating that up to 800 million human workers could be displaced in this timeframe. Jobs such as telemarketers, loan officers, and cashiers have been rated the most likely to be replaced by automation. Skills once considered useful no longer seem to hold importance in the workplace. However, automation and AI don't just affect traditional blue-collar jobs; certain IT jobs are at risk too. A study conducted by the software company Atlassian shows that 87% of workers already think AI will alter their jobs in some way by 2020. On the whole, these statistics seem to portray a gloomy view of jobs in the future, as they don't bode well for many workers.

Overall, the potential negative impact that AI and automation could have on the economy must not be understated. However, the solution to the threat of job loss is not to run away from this technology or attempt to regulate it away; the benefits it provides are too great to ignore. Jobs with rote roles will inevitably be replaced by automation, but we must also recognize that automation cannot replace all employment opportunities. Jobs that require high proficiency in communication, creativity, and complex problem solving will continue to have high demand for human workers, because these are qualities that machines cannot mimic. As an increasing number of blue-collar jobs are replaced, opportunities at the top of the economic ladder continue to increase. Sectors such as IT will not die out; AI and automation will only reshape them and cause workers to redefine their roles. In fact, in 2015 the number of job postings in the U.S. reached its highest level ever, at 5.8 million. AI has replaced certain jobs, but the bigger problem its rise has exposed is a flaw in today's workforce: the majority of workers do not have the skills needed to fill today's job openings.

We must recognize that increasing the intellectual capital of our workers is the only way to combat the job displacement automation brings. The importance of intellectual capital is backed by some of the most successful people in our world, people like Barack Obama, Bill Gates, and Warren Buffett. These thinkers harp on the importance of reading books and taking the time to develop a fundamental level of knowledge, because they recognize that in this day and age, knowledge is the most valuable asset to have. As we move into the future, we must make sure that the next generation of our workforce can obtain the education they need to succeed alongside automation, not without it.

Note from Quali:

We were fortunate to have Tejas Mattur, of Mission High School, intern with the marketing department. As part of his internship, he researched the evolution of automation and its applicability to the workforce of the future, as seen from the viewpoint of a high schooler, and made some recommendations. His work is serialized into a three-part blog series published on the Quali website. Thank you to Tejas, and we wish you great things in the future!


Automation Today

Posted by Tejas Mattur September 19, 2018

Guest blog contribution from Quali’s summer intern – Tejas Mattur, of Mission High School

Until the late stages of the 20th century, automation technology was utilized solely in industrial environments, to assist with the production of material goods. That changed thanks to the development of computers and the rise of the Internet. The advancement of these technologies in turn led to the rapid growth of massive tech firms around the world, all of which began to utilize information technology (IT) to manage their data and improve their business efficiency. The expansion of the IT sector allowed a new type of automation technology to be put into place: IT automation. IT automation is one of, if not the, most prevalent forms of automation technology today. Its popularity has even led to the evolution of technology that goes hand in hand with IT automation, such as artificial intelligence (AI) and machine learning (ML). Compared to less than 100 years ago, when we had just begun to explore the possibilities of technology such as Ford's assembly line, we have now entered a world where we've begun to train robots to exhibit human intelligence and think as we do. The flexibility of intelligent automation will pave the road to the future.


Before getting into the specifics of AI and ML in conjunction with IT automation, we must first take a deeper dive into IT automation itself. Since its introduction, IT automation is one of those terms that means different things to different people, because it includes a diverse array of functions. The official definition of IT automation is the use of instructions to create a repeated process that replaces an IT professional's manual work in data centers and cloud deployments. This replaces a series of actions and responses between an administrator and the IT environment.

There are a variety of IT automation products sold by traditional IT vendors on the market. Companies like Microsoft provide automation capabilities through products such as System Center 2016 Orchestrator and Service Manager, as well as PowerShell DSC. Microsoft sells these products to system administrators and power users, allowing them to automate the administration of a number of different operating systems. Microsoft's products are quite broad; other automation vendors, such as BMC Software and SaltStack, offer more specific products. SaltStack specifically focuses on DevOps, offering automation tools that assist with software deployment integrated within an organization's infrastructure.

Aside from administration and DevOps, IT automation also holds value in fields like data analytics and business intelligence. Studies in recent years show that the largest share of new IT spending is expected to go toward data analytics. Advanced Systems Concepts Inc. has a product known as ActiveBatch, and among its many capabilities are IT automation solutions that help declutter data complexity and improve visibility into data sources and dependencies. These use cases are just a few examples of companies that use IT automation within their business platforms, and they show the importance of automation due to its relevance in a variety of different fields.
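To make the definition above tangible, here is a toy example of "instructions that create a repeated process" standing in for an administrator's manual task, in this case pruning old temporary files; the path and age threshold are arbitrary, and a Unix-like /tmp is assumed:

```python
# Toy illustration of the definition above: a scripted, repeatable process
# replacing an admin's manual task (pruning old temp files).
import os
import time

def prune_old_files(directory, max_age_days=7, dry_run=True):
    cutoff = time.time() - max_age_days * 86400
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            print(f"would remove {path}" if dry_run else f"removing {path}")
            if not dry_run:
                os.remove(path)

prune_old_files("/tmp", max_age_days=7, dry_run=True)  # safe preview run
```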

An important distinction to make when discussing automation is that an automated system is different from an intelligent system. An automated system simply follows the commands it has been given by the human who programmed it. For example, an email spam filter is an automated IT mechanism whose goal is to filter out unwanted junk messages. At times, important emails end up in the spam folder, and unwanted spam can also bypass the filter and land in the normal inbox. The system is not intelligent in the sense that it cannot recognize and correct its own mistakes.
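Here is a toy sketch of that distinction: a fixed rule-based filter that never improves next to one that updates its rules when a human corrects it. The word lists and logic are deliberately simplistic, not a real spam filter:

```python
# Automated vs. "intelligent": the first filter applies fixed rules forever;
# the second adjusts its rules when a human flags a mistake. Toy example only.
SPAM_WORDS = {"winner", "free", "prize"}

def rule_based_is_spam(message):
    # Automated: follows its programmed rule forever, mistakes included.
    return any(word in message.lower() for word in SPAM_WORDS)

class LearningFilter:
    def __init__(self):
        self.spam_words = set(SPAM_WORDS)

    def is_spam(self, message):
        return any(word in message.lower() for word in self.spam_words)

    def correct(self, message, actually_spam):
        # "Intelligent": updates its rules from a human correction.
        for word in message.lower().split():
            if actually_spam:
                self.spam_words.add(word)
            else:
                self.spam_words.discard(word)

print(rule_based_is_spam("you are a winner"))  # True: matches a fixed rule
f = LearningFilter()
print(f.is_spam("limited offer inside"))       # False: misses new spam wording
f.correct("limited offer inside", actually_spam=True)
print(f.is_spam("another limited offer"))      # True: learned from the correction
```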

This is where AI comes in. As we progress toward the future, IT automation will also continue to progress, incorporating AI technology. The 'new IT', intelligent technology, would decrease the importance of human-made automation rules, relying instead on autonomous choices guided by high-level business cost and compliance requirements. The new IT is more commonly known at the moment as AIOps. These tools use predictive analytics and ML to find outcomes, which then trigger automated IT mechanisms. They are helpful in most traditional IT functions, such as data storage, analytics, and administration. AIOps products are relatively new, but a good number already exist, including Splunk's IT Service Intelligence tool, BMC's TrueSight platform, and Cisco's Crosswork Situation Manager. The functionality and adaptability of this technology make it seem as if AI and IT automation will provide nothing but benefits, but when considering the future of automation, we must ask ourselves: do the benefits of AI and automation outweigh their costs?

Note from Quali:

We were fortunate to have Tejas Mattur, of Mission High School, intern with the marketing department. As part of his internship, he researched the evolution of automation and its applicability to the workforce of the future, as seen from the viewpoint of a high schooler, and made some recommendations. His work is serialized into a three-part blog series published on the Quali website. Thank you to Tejas, and we wish you great things in the future!


The History of Automation

Posted by Tejas Mattur September 12, 2018

Guest blog contribution from Quali’s summer intern – Tejas Mattur, of Mission High School

Automation. When we hear the word 'automation' in this day and age, our minds automatically go to advanced technology such as artificial intelligence (AI), machine learning, and robotics. However, the history of automation technology runs much deeper than these modern extensions of automation used in the workplace today. Automation is defined as the creation of technology and its application to control and monitor the production and delivery of various goods and services. The idea of automation isn't necessarily a modern one; the theory behind utilizing automation technology has been around for centuries, although it has become more specific and refined to fit particular industries over the last 100 years.


The word automation traces its earliest roots back to the time of the Ancient Greeks, specifically around 762 B.C. The earliest mention of automation technology came in Homer's The Iliad, in which Homer discusses Hephaestus, the god of fire and craftsmanship. As the story goes, Hephaestus had 'automatons' working in his workshop, essentially self-operating robots that assisted him in developing powerful weapons and other items for the Greek gods. Although there is little to no evidence that Hephaestus's workshop actually existed, the story was written by Homer, a real Greek poet. It shows that the Greeks had at least imagined using automation technology to solve a problem, which for them was improving the efficiency of creating weapons and tools.


Throughout history, there is evidence of different groups of people attempting to use automation to solve the everyday problems they faced, from miners around the 11th century to workers in the 17th century. However, the period when automation really began to take off was the Industrial Revolution. The increase in demand for goods such as paper and cotton changed how these items were produced, with an immense emphasis placed on efficiency and output. In the textile industry, production became mechanized, powered by steam and water, allowing for greater yields. In the paper industry, the Fourdrinier machine was invented, able to make continuous sheets of paper, and it eventually led to methods for rolling continuous sheets of iron and other metals. Huge jumps in other fields such as transportation and communication followed, leading to even more automation technologies. In fact, the term 'automation' itself was coined in 1946, due to the rapid rise of the automobile industry and the increased use of automatic devices in manufacturing and production. D.S. Harder, an engineer who worked for the Ford Motor Company, is credited with the origin of the word.

Overall, automation has a deep and rich history that spans centuries. The main uses of automation prior to the 20th and 21st centuries were in industrial fields, and it has only more recently been incorporated into the IT world. The drivers of all automation technology, however, have always been similar. With industrial automation, the goal was always clear: to improve the efficiency of manufacturing a variety of items. With IT automation, the goal is to improve efficiency by creating processes that are self-sufficient and replace an IT worker's manual labor in data centers and cloud deployments. The parallels are clear, and they show why automation will always be prevalent in society. Developing technology to lessen the burden on human workers, increase business efficiency, and make our lives easier by reducing manual labor has been important to us for centuries, and it will continue to make its impact in the future.

Note from Quali:

We were fortunate to have Tejas Mattur, of Mission High School, intern with the marketing department. As part of his internship, he researched the evolution of automation and its applicability to the workforce of the future, as seen from the viewpoint of a high schooler and made some recommendations. His work is serialized into a three-part blog series published on the Quali website. Thank you to Tejas, and we wish you great things in the future!


DevSecOps Environments Deployed Secure and Fast

Posted by german lopez August 25, 2018

You've just implemented security tools that lower your organization's risk profile for applications deployed on the Microsoft Azure public cloud. End-user experience is compromised, and you're trying to figure out why... sound familiar? Responsibility rests squarely on the DevOps, SecOps, or DevSecOps teams who modified the application workflow behavior.

So where do you start? You contact the DevOps team to provide a test environment so you can begin troubleshooting. The DevOps team is busy rolling out the latest software updates and doesn't have cycles to spare due to production deployment deadlines. At the same time, you're notified that the Azure Load Balancer is being replaced with Nginx, and you have no idea what the ramifications will be for your security posture or end-user experience.

The initial troubleshooting activity occurs in the SecOps environment, but you still need the DevOps team involved. DevOps provides the latest software releases, and DevOps teams, responsible for cloud architectural components, re-platform the Azure environment to reflect network modifications. These tasks are daunting without access to self-service Azure test environments. To address these challenges, test environments are required to isolate troubleshooting activities. The following example outlines a microservice application deployed in a hybrid Azure cloud, using Quali CloudShell for orchestration.

Functionality:  The first step in any application and infrastructure deployment is to ensure that the baseline functions of the application are responsive per the requirements.  CloudShell provides the capability to introduce objects that represent the physical and virtual elements required within the solution architecture.  These objects are modeled and deployed in Azure Public Cloud and within the Azure Stack at the organization's datacenter or remote edge network.


Cybersecurity:  Once the functionality of the solution has been validated, the security software components are assessed to determine whether they are the cause of the traffic bottlenecks.  In this example, a security scan using Cavirin's Automated Risk Analysis Platform (ARAP) determines the risk posture.  If the risk score violates a regulation or compliance standard, a polymorphic binary scrambling solution from Polyverse is installed to enable a moving target defense.  The DevSecOps team uses the blueprint design to update the application software, determine the risk posture, and remediate as required.
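A minimal sketch of the gate just described: score the environment, compare it against a compliance threshold, and trigger remediation when it fails. The threshold and function names are illustrative assumptions, not Cavirin's or Polyverse's actual APIs:

```python
# Sketch of a compliance gate: score the environment, compare against a
# threshold, and trigger remediation on failure. Values are illustrative.
COMPLIANCE_THRESHOLD = 80  # assumed minimum acceptable posture score

def security_gate(risk_score):
    if risk_score < COMPLIANCE_THRESHOLD:
        print(f"score {risk_score} violates policy: deploying moving target defense")
        return "remediate"
    print(f"score {risk_score} passes: promote environment")
    return "pass"

security_gate(72)  # -> remediate
security_gate(91)  # -> pass
```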


Performance:  So we're feeling good: functionality is in place and security protection is enabled, but the end-user experience is terrible!  Visibility is required at the data, application, and network layers within the Azure cloud environment to determine where the bottlenecks exist.  In this example, Accedian PVX captures, analyzes, and reports on the traffic workflows, with test traffic from BlazeMeter.  Environments are quickly stood up by the DevSecOps team, and a root cause of the traffic bottlenecks is identified.


Automation:  Bringing it all together requires a platform that allows you to mix and match different objects to build your solution architecture.  In addition, self-service is a key workflow component that allows each team to conduct its own operations.  This saves time, resources, and costs: solution validation can be achieved in minutes or hours rather than days, weeks, or months.


In summary, CloudShell automates environment orchestration, modeling, and deployment.  Any combination of public, private, and hybrid cloud architectures is supported.  The DevOps team can ensure functionality and collaborate with SecOps to validate the security risk and posture.  This collaboration enables a DevSecOps workflow that ensures performance bottlenecks are addressed with visibility into cloud workloads.  Together, this allows the DevSecOps team to deploy environments secure and fast.

To learn more about DevSecOps environments, please visit the Quali resource center to access documents and videos.

Building a Developer Community from the Ground Up

Posted by Pascal Joly August 24, 2018

In the software world, developer communities have been the de facto standard since the rise of the open source movement. What started as a counter-culture alternative to the commercial dominance of Microsoft has spread rapidly, way beyond its initial roots. Nowadays, very few question the motivation to offer an open source option as a valid go-to-market strategy. Many software vendors have used this approach in recent years to acquire customers through the freemium model and eventually generate significant business (Red Hat, among others). From a marketing standpoint, a community is a great vehicle to increase brand visibility and reach end users.

The Journey to get a community off the ground can be long and arduous

If in theory it all sounds great and fun, in practice our journey from concept to reality was long and arduous.

It all starts with a cultural change. While it now seems straightforward to most software engineers (just as smartphones and ubiquitous WiFi are to millennials), changing the mindset from a culture of privacy and secrecy to one of openness is significant, especially for more mature companies. With roots in the conservative air force, this shift did not happen overnight at Quali. In fact, it took us about three years to get all the pieces off the ground and the whole company aligned behind this new paradigm. Eventually, what started as a bottom-up, developer-driven initiative bubbled up to the top and became both a business opportunity and a way to establish a competitive edge.

A startup like Quali can only put so many resources behind the development of custom integrations. As an orchestration solution depends on a stream of up-to-date content, the team was unable to keep up with the constant stream of customer demand. The only way to scale was to open up our platform to external contributors and standardize through an open source model (TOSCA). Additionally, automation development was shifting to Python-based scripting, away from proprietary, visual-based languages. Picking up on that trend early, we added a new class of objects (called "Shells") to our product; they supported Python natively and became the building blocks of all our content.
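For a feel of what a Python-native building block might look like, here is a conceptual sketch of a tiny driver class that a platform could load per resource type. It is an illustration of the idea only, not the actual CloudShell Shell interface:

```python
# Conceptual sketch of the "Shell" idea: a small Python driver class per
# device or service type, loaded by the platform as a building block.
class ExampleSwitchShell:
    """Driver for one resource type; the platform calls these hooks."""

    def initialize(self, address):
        # In a real driver this would open a session to the device.
        self.address = address
        print(f"connected to switch at {address}")

    def health_check(self):
        return {"address": self.address, "status": "ok"}

shell = ExampleSwitchShell()
shell.initialize("192.0.2.10")  # documentation-range test address
print(shell.health_check())
```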

Putting together the building blocks

We started by exploring existing examples of communities that we could learn from. There is thankfully no shortage of successful software communities in the cloud and DevOps domain: AWS, Ansible, Puppet, Chef, and Docker, to name a few. One thing came across pretty clearly: a developer community isn't just a marketplace where users can download the latest plugins for our platform. Even if it all started with that requirement, we soon realized it would not be nearly enough.

What we really needed was to build a comprehensive "one-stop shopping" experience: a technical forum, training, documentation, an idea box, and an SDK that would help developers create and publish new integrations. We had bits and pieces of these components, mostly available only to internal authorized users, and this was an opportunity to open up that knowledge and improve access and searchability. It also allowed us to consolidate disjointed experiences and provide a consistent look and feel across all these services. Finally, it was a chance to revisit existing processes that were not working effectively for us, like our product enhancement requests.

Once we had agreed on the various functions we expected our portal to host, it was time to select the right platform(s). While no vendor covered 100% of our needs, we ended up picking AnswerHub for most of the components, such as the knowledge base forum, idea box, and integrations, and using a more specialized backend for our Quali University training platform. For the code repository, GitHub, already the ubiquitous standard with developers, was a no-brainer.

We also worked on making the community content easier to consume for our target developer audience. That included a command-line utility, "ShellFoundry", that makes it simple to create a new integration. Who said developing automation has to be a complicated and tedious process? With a few commands, this CLI tool can get you started in minutes. Behind the scenes? A set of TOSCA-based templates covering 90% of the needs, while the developer customizes the remaining 10% to build the desired automation workflow. It also involved product enhancements to make sure this newly developed content could be easily uploaded and managed by our platform.

Driving Adoption


Once we had all the pieces in place, it was time to grow the community beyond the early adopters. It started with educating our sales engineers and customer success teams on the new capabilities, then communicating them to our existing customer base. Customers embraced the new experience eagerly, since searching for and asking about technical information was so much faster. They also gained visibility, through our idea box, into all current enhancement requests, and could endorse other customers' suggestions to raise the priority of a given idea. 586 ideas have been submitted so far, all nurtured diligently by our product team.

The first signs of success with our community integrations came when technology partners signed up to develop their own integrations with our product, using our SDK, and published them as publicly downloadable content. We now have 49 community plugins, and counting. This is an ongoing effort, raising interesting questions such as how to vet the quality of content submitted by external contributors and what support process should stand behind it.

It's clear we've come a long way over the last three years. Where do we go from here? To motivate new participants, our platform offers a badge program that highlights the most active contributors in any given area. For example, you can earn the "Bright Idea" badge if you submit an idea that is voted up five times. We also created a Champion program to reward active participants in different categories (community builder, rocket scientist...). We invite our customers to nominate their top contributors, and once a quarter we select and reward winners, who are also featured in an article with a nice spotlight.

What's next? Check out Quali's community, and start contributing!