This post was co-authored with Ike Bennion from Visier.
Visier’s mission is based on the belief that people are every organization’s most valuable asset, and optimizing their potential requires a nuanced understanding of employee dynamics.
Paycor is one of the many leading enterprise companies worldwide that trust the Visier platform to process large amounts of people data and generate useful analytics and actionable predictive insights.
Visier’s predictive analytics helps organizations like Providence Healthcare retain key employees by identifying and preventing employee attrition using a framework built on Visier’s attrition risk predictions, saving an estimated $6 million.
Trusted sources such as Sapient Insights Group, Gartner, G2, TrustRadius, and RedThread Research praise Visier for its ingenuity, superior user experience, and vendor and customer satisfaction. Today, more than 50,000 organizations in 75 countries use the Visier platform to develop business strategy and drive business performance.
Unleash growth potential by overcoming technology stack barriers
Visier’s analytical and predictive power is what makes its people analytics solutions so valuable. Even users with no data science or analytics experience can generate rigorous, data-backed predictions to answer big questions like time-to-fill for key positions or risk of attrition for key employees.
At Visier, continuing to innovate on analytical and predictive capabilities has been a priority for the management team, because these features are one of the cornerstones of what users love about the product.
The challenge for Visier was that their data science technology stack was preventing them from innovating at the rate they wanted. Experimenting with and implementing new analytical and predictive capabilities was costly and time-consuming for several reasons:
- The data science technology stack was closely tied to the overall platform development. Data science teams were unable to independently roll out changes to production. This resulted in fewer and slower iteration cycles for the team.
- The data science technology stack was a collection of solutions from multiple vendors, which increased management and support overhead for the data science team.
Streamline model management and deployment with SageMaker
Amazon SageMaker is a managed machine learning platform that gives data scientists and data engineers familiar concepts and tools to build, train, and deploy models, and to manage the infrastructure needed for highly available and scalable model inference endpoints. Amazon SageMaker Inference Recommender is one example of a tool that helps data scientists and data engineers become more autonomous and less dependent on external teams by providing guidance on right-sizing inference instances.
The existing data science technology stack was one of the many services that made up Visier’s application platform. Visier used SageMaker to build an API-based microservices architecture for its analytics and predictive services that is decoupled from the application platform. This gave the data science team the autonomy it wanted to deploy changes independently and release new updates more frequently.
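A minimal sketch of how such a decoupled prediction microservice might call a SageMaker inference endpoint from application code. The endpoint name and feature schema here are hypothetical, not Visier’s actual contract:

```python
import json

def serialize_features(features: dict) -> bytes:
    # Serialize one feature record as a JSON payload; the {"instances": [...]}
    # shape is an illustrative convention, not a fixed SageMaker requirement.
    return json.dumps({"instances": [features]}).encode("utf-8")

def predict_attrition_risk(features: dict,
                           endpoint_name: str = "attrition-risk-prod") -> dict:
    # boto3 is imported lazily so this module can load without AWS credentials.
    import boto3
    runtime = boto3.client("sagemaker-runtime")
    response = runtime.invoke_endpoint(
        EndpointName=endpoint_name,
        ContentType="application/json",
        Body=serialize_features(features),
    )
    return json.loads(response["Body"].read())
```

Because the endpoint is invoked over an API like this, the model behind it can be retrained and redeployed by the data science team without touching the application platform.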
Results
After migrating its analytical and predictive services to SageMaker, the first improvement Visier saw was that the data science team spent less time on deployment details and vendor tool integration, and more time on innovation, such as building a predictive model validation pipeline.
Predictive model validation
The following diagram shows the predictive model validation pipeline.
Using SageMaker, Visier built a predictive model validation pipeline that performs the following steps:
- Retrieve the training dataset from the production database
- Gather additional validation measures that describe the dataset, along with specific modifications and enhancements to the dataset
- Perform multiple cross-validation measurements using different splitting strategies
- Save the validation results, along with metadata about the run, in a persistent data store
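The steps above can be sketched locally with scikit-learn; this is an illustrative stand-in (synthetic data, a toy model, and a JSON file in place of the production database and persistent store), not Visier’s actual pipeline:

```python
import json
import time
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, StratifiedKFold, cross_val_score

# Stand-in for the training dataset retrieved from the production database.
X, y = make_classification(n_samples=300, n_features=8, random_state=0)

model = LogisticRegression(max_iter=1000)

# Cross-validate with more than one splitting strategy, as the pipeline does.
strategies = {
    "kfold": KFold(n_splits=5, shuffle=True, random_state=0),
    "stratified": StratifiedKFold(n_splits=5, shuffle=True, random_state=0),
}

results = {
    name: cross_val_score(model, X, y, cv=splitter, scoring="roc_auc").mean()
    for name, splitter in strategies.items()
}

# Persist scores with run metadata, as the final pipeline step does.
record = {"run_at": time.time(), "dataset_rows": len(X), "scores": results}
with open("validation_results.json", "w") as f:
    json.dump(record, f)
```

Running every candidate model change through the same set of splitting strategies and persisting the scores is what makes model improvements comparable across runs.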
The validation pipeline enabled the team to make a series of advancements to the model, increasing predictive performance by 30% across the customer base.
Train customer-specific predictive models at scale
Visier develops and manages thousands of customer-specific predictive models for its enterprise customers. The second workflow improvement the data science team made was to develop a scalable method for generating all of these customer-specific models. This allowed the team to deliver 10x more models with the same number of resources.
As shown in the preceding diagram, the team developed a model training pipeline in which model changes are made in a central prediction codebase. This codebase runs separately for each Visier customer and trains (at different points in time) a set of custom models that are sensitive to each customer’s specialized configuration and data. Visier uses this pattern to scalably propagate single-model design innovations to thousands of custom models across its customer base. To ensure state-of-the-art training efficiency for large models, SageMaker provides libraries for model-parallel (SageMaker Model Parallel library) and data-parallel (SageMaker Distributed Data Parallel library) training. For more information about the effectiveness of these libraries, see Distributed training and efficient scaling with the Amazon SageMaker Model Parallel and Data Parallel libraries.
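The pattern of one central codebase producing a tailored model per customer can be sketched as follows; the customer IDs, configurations, and synthetic per-customer datasets are hypothetical (in production, each run would be a separate SageMaker training job against that tenant’s data):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical per-customer configuration; real runs would load each
# customer's specialized settings and data from their own tenant.
CUSTOMER_CONFIGS = {
    "customer_a": {"n_estimators": 50},
    "customer_b": {"n_estimators": 100},
    "customer_c": {"n_estimators": 150},
}

def train_customer_model(customer_id: str, config: dict):
    # Stand-in for loading that customer's dataset; the seed just makes
    # each customer's synthetic data distinct.
    X, y = make_classification(n_samples=200, n_features=6,
                               random_state=hash(customer_id) % (2 ** 32))
    model = GradientBoostingClassifier(n_estimators=config["n_estimators"],
                                       random_state=0)
    model.fit(X, y)
    return model

# The same central codebase produces one tailored model per customer.
models = {cid: train_customer_model(cid, cfg)
          for cid, cfg in CUSTOMER_CONFIGS.items()}
```

Because the training logic lives in one place, an improvement to the shared codebase flows to every customer’s model on its next scheduled run.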
Using the model validation workload shown earlier, the team can validate changes to a predictive model in just 3 hours.
Processing unstructured data
Iterative improvement, scalable deployment, and consolidation of data science technologies were a great start, but when Visier adopted SageMaker, the goal was to enable innovation that was completely out of reach with the previous technology stack.
A unique advantage of Visier is its ability to learn from the collective actions of employees across its customer base. Tedious data engineering tasks, such as moving data into a separate environment, and the cost of database infrastructure were eliminated by storing massive customer-related datasets securely in Amazon Simple Storage Service (Amazon S3) and querying the data directly with SQL using Amazon Athena. Visier used these AWS services to combine related datasets and feed them directly into SageMaker. As a result, Visier created and released a new prediction product called Community Predictions. Visier’s Community Predictions allow small organizations to create predictions based not only on their own data, but also on data from their entire community. This gives an organization with 100 employees access to forecasts that would otherwise be available only to companies with thousands of employees.
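A minimal sketch of querying S3-resident data through Athena before feeding it to SageMaker. The table and column names are hypothetical stand-ins for the community dataset; `run_athena_query` requires AWS credentials and an Athena workgroup to actually execute:

```python
import time

def build_community_query(table: str, org_size_min: int, org_size_max: int) -> str:
    # Illustrative SQL over a hypothetical community-events table in S3;
    # Athena lets you query it in place, with no database to load.
    return (
        f"SELECT org_id, tenure_months, role_family, exited "
        f"FROM {table} WHERE org_size BETWEEN {org_size_min} AND {org_size_max}"
    )

def run_athena_query(query: str, database: str, output_s3: str) -> str:
    # boto3 is imported lazily so the sketch can be read without AWS access.
    import boto3
    athena = boto3.client("athena")
    execution = athena.start_query_execution(
        QueryString=query,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )
    query_id = execution["QueryExecutionId"]
    # Poll until the query finishes, then return its ID so the results
    # written to output_s3 can be fetched and passed to SageMaker.
    while True:
        state = athena.get_query_execution(QueryExecutionId=query_id)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            return query_id
        time.sleep(1)
```

Because Athena writes query results back to S3, the output location can be handed directly to a SageMaker training job as its input channel.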
To learn how to manage and process your own unstructured data, see Managing and Governing Unstructured Data with AWS AI/ML and Analytics Services.
Use Visier data with Amazon SageMaker
Given Visier’s transformative success internally, they wanted to enable their end customers to benefit from the Amazon SageMaker platform as well, to develop their own artificial intelligence and machine learning (AI/ML) models.
Visier has created a complete tutorial on how to use Visier Data with Amazon SageMaker and has also built a Python connector that is available in a GitHub repository. The Python connector allows customers to pipe Visier data into their AI/ML projects to better understand the impact their workforce has on their finances, operations, customers, and partners. These results are often imported back into the Visier platform to distribute these insights and drive derived analytics to further improve outcomes across the employee lifecycle.
Conclusion
Visier’s success with Amazon SageMaker demonstrates the power and flexibility of this managed machine learning platform. Using SageMaker’s capabilities, Visier increased model output by 10x, accelerated innovation cycles, and opened up new opportunities such as processing unstructured data for its Community Predictions product.
If you want to streamline your machine learning workflows, scale model deployment, and derive insights from your data, explore the possibilities with SageMaker and built-in features like Amazon SageMaker Pipelines.
Get started today by creating an AWS account and opening the Amazon SageMaker console, or connect with your AWS account team to set up an experience-based acceleration engagement. Unlock the full potential of your data, build custom AI and ML models, and drive actionable insights and business impact today.
About the authors
Kinman Lamb is a Solutions Architect at AWS responsible for the health and growth of some of Western Canada’s largest ISV/DNB companies. He is also a member of the AWS Canada Generative AI vTeam and has helped a growing number of Canadian companies successfully launch advanced generative AI use cases.
Ike Bennion is Vice President of Platforms and Platform Marketing at Visier and a recognized thought leader at the intersection of people, work, and technology. He has a rich history in implementation, product development, product strategy, and go-to-market, and specializes in market intelligence, business strategy, and innovative technologies such as AI and blockchain. Ike is passionate about using data to drive fair and intelligent decision-making. Outside of work, he enjoys dogs, hip hop, and weightlifting.