Engineering firm CDG, Inc., had relied on manual processes for storing its financial data and project information. To generate monthly reports for analysis, the firm was also manually downloading data to an Excel spreadsheet. These processes were error-prone and of limited use. CDG turned to IT consulting and development company New Era, which helped the firm modernize its data storage and project management and improve its reporting by implementing a Microsoft Azure SQL-driven approach, supplemented by the reporting capabilities of Power BI.

Customer Challenges
CDG provides engineering, consulting, and development services to public entities, private enterprises, and the petroleum industry. Data is key to its success, but the slow, manual data-entry processes the firm had relied on left room for errors. CDG had been using the Deltek Vantagepoint solution to store its data and an OLAP cube to connect the data, but the cube alone couldn't store historical data. The firm also found the overall usability of the data it needed for accurate reporting to be limited, and it couldn't easily create data visualizations or surface actionable insights.

Partner Solution
To act on its job-to-date (JTD) reporting with confidence, CDG needed greater accuracy and improved insights when reviewing its accounting data at the end of each month. Because the firm's data lived in a separate database and OLAP cube, the data is now extracted from the cube with an MDX query and moved to another database through a linked server. The steps that capture the historical and JTD accounting data are encoded in a stored procedure that combines MDX and Microsoft Azure SQL queries. The project uses SQL to gather multiple datasets and organize accounting-cycle cashflows into a pipeline reporting dashboard and PDF in Power BI.
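The extraction step described above, sending MDX through a linked server into a relational table, typically relies on SQL Server's OPENQUERY. The sketch below builds that statement in Python; the linked server name, table, cube name, and measures are hypothetical stand-ins, not CDG's actual schema.

```python
# Sketch of the kind of extraction a stored procedure like the one above
# might run each month: OPENQUERY sends MDX to the OLAP cube through a
# linked server and returns a relational rowset that plain SQL can insert
# into a history table. All object names here are illustrative assumptions.

def build_jtd_extract_sql(linked_server: str, target_table: str) -> str:
    """Compose the T-SQL that captures a monthly JTD snapshot."""
    mdx = (
        "SELECT {[Measures].[JTD Billed], [Measures].[JTD Cost]} ON COLUMNS, "
        "NON EMPTY [Project].[Project Number].Members ON ROWS "
        "FROM [Vantagepoint]"
    )
    # Cube column names come back bracketed, so they are quoted identifiers.
    return (
        f"INSERT INTO {target_table} (project_number, jtd_billed, jtd_cost, snapshot_date)\n"
        f'SELECT "[Project].[Project Number].[MEMBER_CAPTION]", '
        f'"[Measures].[JTD Billed]", "[Measures].[JTD Cost]", GETDATE()\n'
        f"FROM OPENQUERY({linked_server}, '{mdx}')"
    )

sql = build_jtd_extract_sql("CUBE_LINK", "dbo.JTD_History")
print(sql)
```

Storing each run's result with a snapshot date is what gives the historical view the cube alone could not provide.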
Challenge
This global nonprofit healthcare organization connects public healthcare programs, including Medicaid, Medicare, and the Marketplace, to underserved and vulnerable populations. Due to the nature of their business, they need accurate, readily available information. The organization began a vital initiative to develop more mature business practices that would give them better access to their data and extract valuable insights. To meet this goal, they implemented a data supply chain strategy that gathered data from hundreds of source systems and built a Hadoop data lake to serve their data warehouse and analytical requirements. However, the implementation fell short of their goals: data delivery to the organization, and the plan itself, were sluggish and failed to meet quality expectations. They still needed a data supply chain delivery model that would accelerate business users' access to high-quality data, meet regulatory requirements for the healthcare industry, and create a data governance model aligned across departments. They reached out to New Era Technology to assist in creating a data solution that would ensure the quality and timeliness of their data and subsequent analytics.

Solution
Our first step was to complete a current state assessment that collectively addressed all aspects of their data supply chain strategy. Through this assessment, we identified opportunities to address each challenge and provided prioritized recommendations for execution. These recommendations included:

- Establishing a scaled agile program management office
- Optimizing the delivery, data quality, and metadata processes under an Agile framework
- Leveraging data integration technologies to maximize delivery
- Realigning the data architecture

We outlined these recommendations in a roadmap that established the program operating model and realigned their architecture to create the foundation for confirmed delivery.
After developing a data lake in Hadoop that collects all data from the source systems to support diverse use cases, we developed an enterprise metadata strategy in which all data is accompanied by the forms of metadata that best serve business stakeholders and technical users. Looking beyond the initial needs for BI and analytics, the solution also supports more diverse use cases, including PHI handling, security, and compliance, with proper data quality throughout the data flow.
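A metadata strategy like the one described might represent each lake asset with layered technical, business, and compliance metadata. The sketch below is a minimal illustration; the field names are assumptions, not the organization's actual metadata schema.

```python
# Illustrative sketch of dataset-level metadata serving both technical
# users and business stakeholders, as described above. Fields and values
# are hypothetical examples, not the organization's real catalog entries.
from dataclasses import dataclass, field

@dataclass
class DatasetMetadata:
    name: str
    source_system: str
    # Technical metadata for engineers
    schema: dict            # column name -> type
    refreshed_at: str
    # Business metadata for stakeholders
    owner: str
    description: str
    # Compliance metadata supporting PHI, security, and audit needs
    contains_phi: bool
    allowed_uses: list = field(default_factory=list)

claims = DatasetMetadata(
    name="claims_2024",
    source_system="medicaid_feed_07",
    schema={"member_id": "string", "paid_amount": "decimal"},
    refreshed_at="2024-06-01T04:00:00Z",
    owner="Claims Analytics",
    description="Adjudicated claims, deidentified for analytics use",
    contains_phi=False,
    allowed_uses=["BI", "analytics"],
)
print(claims.name, claims.allowed_uses)
```

Carrying the compliance fields alongside the technical ones is what lets a single catalog answer both "can I use this?" and "what does it contain?".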
Challenge
A state agency governing child services, programs, and policies wanted to modernize their technology and data platforms to better fit new goals around enabling more proactive case management. Case management is highly regulated and must maintain compliance with federally mandated timeframes. To improve their processes and gain the insight needed to facilitate proactive case management, the agency wanted to take regularly updated raw data from their operational environment and bring the desired analysis to life, resulting in more concrete, valuable insights. As the agency worked to transform some of their core functions, they faced concerns about whether their existing data architecture could deliver reporting, analytics, and BI in a modern landscape. To meet their objectives, they decided to use Amazon Web Services (AWS) as their cloud data solution, but they had limited developer resources skilled on the AWS platform. Their other major challenge was the significant bottlenecks created by process issues and backlogs at different levels of the agency and the other departments involved. The agency reached out to Fusion to assist them in delivering a modern data platform that would integrate seamlessly into their ecosystem and build a foundation for the future.

Solution
First, we performed an assessment of the current data landscape that also considered the processes in place to enter and analyze data.
This included:

- Identifying child support cases within the state that exceeded federal timeframe compliance rules or were approaching the federal deadlines
- Pinpointing bottlenecks in processing at the agency and caseworker levels that were leading to compliance issues
- Understanding where and how data was currently being entered and utilized throughout the agency

Once we completed the assessment, we created a data model design for their immediate use case (enabling proactive case management) that supported BI analysis of current cases and trends over time and would integrate different subject areas from their core systems. Our team also provided a modern data reference architecture to guide platform migration within the AWS cloud environment. Our team stood up native AWS services, including:

- Identity and access management (IAM)
- Virtual private cloud (VPC) to launch resources in a logically isolated, defined virtual network
- Lambda, a serverless compute service that runs code in response to events and manages compute resources for the user
- Glue and the Glue Data Catalog to analyze and categorize data during the extract, transform, and load (ETL) process
- S3, Aurora, and Redshift to establish raw, curated, and enriched environments for the data as it is processed from a transactional arrangement of string inputs into a strongly typed and structured format for reporting

Our team designed data models and built databases in each of the environments and highlighted additional AWS-specific capabilities for future use cases involving volume growth. By implementing this cloud data solution, the data was brought to life in a series of visualizations of the defined metrics across a host of relevant dimensions, including maps of the state showing counties with compliance issues and drill-down options that surfaced cases requiring immediate attention.
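The "raw strings to strongly typed records" step above can be sketched as a small transformation of the kind that might run inside a Lambda or Glue job. The field names, deadline window, and derivation rules below are illustrative assumptions; the real jobs read from S3 and wrote to Aurora and Redshift.

```python
# Illustrative sketch of the curation step described above: converting a
# raw transactional record of strings into typed fields and deriving the
# compliance flags used for reporting. Field names and the 90-day window
# are hypothetical, not the agency's actual rules.
from datetime import date, datetime

FEDERAL_DEADLINE_DAYS = 90  # assumed compliance window for illustration

def curate_case_record(raw: dict) -> dict:
    """Type-convert one raw case record and derive compliance flags."""
    opened = datetime.strptime(raw["case_opened"], "%Y-%m-%d").date()
    days_open = (date.fromisoformat(raw["as_of"]) - opened).days
    return {
        "case_id": int(raw["case_id"]),
        "county": raw["county"].strip().title(),
        "days_open": days_open,
        # Flag cases already out of compliance or approaching the deadline.
        "out_of_compliance": days_open > FEDERAL_DEADLINE_DAYS,
        "approaching_deadline": FEDERAL_DEADLINE_DAYS - 14 <= days_open <= FEDERAL_DEADLINE_DAYS,
    }

record = curate_case_record({
    "case_id": "1042", "county": " franklin ",
    "case_opened": "2024-01-01", "as_of": "2024-03-15",
})
print(record["days_open"], record["out_of_compliance"])
```

Deriving the flags during curation is what lets the downstream maps and drill-downs simply filter on a boolean instead of recomputing deadlines at query time.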
The data architecture also allowed the agency to consider other BI technologies, such as Tableau, for future use cases and extend the architecture to support data growth and use cases at scale.
Challenge
State agencies gather and generate massive amounts of data. A Midwest state had approximately 4 petabytes of data siloed in 1,600 databases across 120 separate agencies, boards, and commissions. Despite the vast amount of data, it was mainly used only for traditional reporting by each individual group. There were no cross-agency sharing or analytics capabilities in place, which prevented each agency, and the state as a whole, from leveraging the data into a more holistic picture. The state's governor recognized the importance of accessing and using the data and tasked an agency with removing the data silos and opening up access to the groups. He maintained that unlocking the data was imperative to allow the state to identify and drive meaningful social change and tackle complex problems facing residents' health, security, and well-being. Two significant challenges blocked their ability to meet this objective. First were privacy concerns: many of the state agencies are highly regulated and have very strict privacy rules and regulations in place, and without proper governance, they couldn't share data while remaining in compliance. Second, their existing technology landscape was a significant obstacle to creating a data architecture that opened up access throughout the state government. Most agencies were saddled with legacy platforms and didn't have the resources to break long-term contracts or shift to a new, modern data platform. To meet their objective, the state agency tasked with overseeing this project reached out to New Era to help establish a data sharing platform that would empower the state to maximize their data's potential through analytics.

Solution
Once we assessed the challenges the state agency was facing, we worked with them to develop a robust data sharing platform that would also protect and secure their data.
Our data experts used Cloudera's open source platform distribution and Hadoop's reliable, scalable framework to build a big data platform that allowed the agencies to share resources, tools, and common services to execute their analytical use cases. To ensure compliance with strict privacy and security regulations, our team's work on the platform included:

- Defining a data governance program and establishing processes and policies, including data lineage, metadata management, quality assessments, and usage monitoring
- Developing deidentification capabilities to protect critical data
- Configuring security services to include authorization, authentication, auditing and monitoring, and encryption

To ensure data could be transferred and shared smoothly between the agencies, our team also built in:

- A framework to support shared capabilities for data ingestion and analytics, along with the flexibility for agencies to integrate uniquely with their internal systems
- Policies that enable agencies to launch and solicit new projects
- Platform administration services for the care and feeding of the platform

Our team also worked directly with different agencies throughout the state, socializing, evangelizing, and consulting to help them leverage the analytics platform for their use cases. Each agency can operate securely within a flexible and scalable platform that they can customize as their needs change and as they continue to build out use cases and include additional agencies, boards, and commissions.
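One common way to implement deidentification like that described above is keyed pseudonymization: direct identifiers are replaced with stable tokens so agencies can still join shared records without exposing identities. The sketch below is a minimal illustration; the key, field list, and token length are assumptions, and a production platform would manage keys through its governance program.

```python
# Minimal sketch of the deidentification capability described above:
# identifying fields are replaced with keyed HMAC pseudonyms. The same
# input always maps to the same token, so deidentified records from
# different agencies can still be joined. Key and fields are hypothetical.
import hashlib
import hmac

AGENCY_SHARED_KEY = b"demo-key-held-by-the-governance-program"
IDENTIFYING_FIELDS = {"ssn", "name", "dob"}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with identifying fields pseudonymized."""
    out = {}
    for field_name, value in record.items():
        if field_name in IDENTIFYING_FIELDS:
            digest = hmac.new(AGENCY_SHARED_KEY, value.encode(), hashlib.sha256)
            out[field_name] = digest.hexdigest()[:16]  # shortened stable token
        else:
            out[field_name] = value
    return out

shared = deidentify({"ssn": "123-45-6789", "county": "Summit", "program": "SNAP"})
print(shared["county"], shared["ssn"])
```

Because the mapping is keyed rather than a plain hash, an outside party cannot regenerate tokens from guessed identifiers without the governance program's key.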
After an Intensive POC Process, This Oil Co-op Landed on Azure and Snowflake for a Modern Data Platform

Challenge
This leading Midwest oil refinery cooperative must manage many business processes to successfully deliver fuel and oil, oversee pipelines, and manage retailers. To do so, they were relying on point solutions from multiple vendors for many of their operational needs. They had data coming in from these disparate solutions but no enterprise data management platform to integrate the information and support accurate business intelligence reporting. And they were not just looking to integrate their data; they wanted to elevate their data landscape with advanced analytics and machine learning to improve business outcomes such as price optimization and identifying targets for profitable oil exploration. Their primary goal was to evaluate which platforms would best support their business needs and provide scalability and efficiency for their anticipated use of the platform. We had done previous work for this client, so when the time came to integrate and modernize their data landscape, they reached out for help.

Solution
Modern Data Platform Evaluation
The first step was to assess the client's current data landscape, state of applications, and demands for data integration based on business needs. We looked at their business requirements for data and functionality from the key sources, including WolfPack, Guardian, and select unstructured manual data sources, and organized the data into subject areas to serve as a semantic layer for all available data. With an understanding of their current state, we then began evaluating viable solution options for a target modern data platform. Based on feedback from the client, we presented two options with a focus on cloud implementations that support BI and analytics: an Azure-based data architecture and a Snowflake Data Cloud solution.
We then provided the client with a complete assessment deliverable that included our findings on their current state and viable options for the modern data platform, including:

- An outline of solution options, components, benefits, and impacts
- An outline of the data architecture
- Estimated costs for development and deployment
- Anticipated operational costs
- Side-by-side comparisons of key features and considerations of the two options

We also provided the client with a high-level roadmap of prioritized next steps, which included the creation of POCs to test the business use cases against.

Modern Data Platform Proofs of Concept (POCs)
Armed with the information from our initial assessment, the client asked us to execute a combination of POCs to demonstrate how these modern Azure-based cloud platforms and technologies would support the company's BI and analytics needs. To create the POCs, we defined the modern data platform reference architecture within Azure with options for specific technologies, including native Azure components (e.g., Azure Data Factory, Azure Data Lake, Azure SQL Data Warehouse, Azure Analysis Services, Power BI, and Azure Databricks) and Snowflake. Building the POCs included:

- Configuration of the Azure subscription and services
- Design of structures in Azure SQL Data Warehouse
- Creation of a tabular model in Azure Analysis Services
- Execution of data pipelines into Azure and Snowflake
- Creation of a Snowflake database and designated views
- Build-out of data pipelines between Azure services and Snowflake
- Creation of an analytics model using Spark with Azure Databricks

Once the platform was created, we demonstrated BI reporting with Power BI against the POC platforms and executed machine learning use cases using Azure ML Studio and custom model development. We then tested multiple use cases for data ingestion, BI, and analytics.
We validated various use cases for viability, including:

- Financial reporting using Dynamics AX data integrated with budgeting and pricing data
- Supply chain optimization
- Predictive maintenance
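At its core, the financial-reporting use case above joins Dynamics AX actuals with the budgeting feed to compute variance per account. The sketch below shows that join in the simplest possible form; the account names and figures are hypothetical, and in the POC this logic lived in the warehouse and BI layer rather than in application code.

```python
# Toy sketch of the validated financial-reporting use case: joining
# Dynamics AX actuals with budget figures to compute variance per account
# and period. All names and amounts are illustrative assumptions.
actuals = [  # as exported from Dynamics AX
    {"account": "4000-Fuel", "period": "2024-01", "actual": 120_000.0},
    {"account": "4100-Pipeline", "period": "2024-01", "actual": 45_500.0},
]
budget = {  # keyed by (account, period) from the budgeting feed
    ("4000-Fuel", "2024-01"): 110_000.0,
    ("4100-Pipeline", "2024-01"): 50_000.0,
}

def budget_variance(actuals: list, budget: dict) -> list:
    """Attach the budgeted amount and variance to each actuals row."""
    rows = []
    for row in actuals:
        budgeted = budget.get((row["account"], row["period"]), 0.0)
        rows.append({**row, "budget": budgeted, "variance": row["actual"] - budgeted})
    return rows

report = budget_variance(actuals, budget)
print(report[0]["variance"], report[1]["variance"])
```

In the warehouse, the same shape becomes a join between the integrated actuals and budget tables, with the variance exposed as a measure in the BI model.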
Challenge
Statistics show that in 2031, the U.S. population over the age of 65 will be around 75 million, almost double what it was in 2008. This national retirement services organization recognized that a generation of this size transitioning out of the workforce would cause two significant economic shifts. First, it would instigate a considerable transfer of wealth as balances switched from asset accumulation to asset withdrawal. Second, asset wealth would move to the next generation as wills were executed, and the upcoming generations have different needs, a greater digital focus, and increasing expectations for customer service and engagement. The annuity organization knew that to better serve transitioning and emerging clients, they needed a more integrated approach to accessing and leveraging comprehensive data about their clients, products, and beneficiaries. However, several challenges blocked their goals. First, their data assets were scattered across a disintegrated data warehouse, siloed business systems, and disparate reporting assets. They needed a partner to look at their entire data landscape, assess its current state, and create a strategic and actionable plan to meet their new business needs. They were (and continue to be) a well-established, multi-billion-dollar organization, which isn't a challenge in and of itself. But they had numerous legacy systems and a highly fragmented data asset landscape already in place, and there were long-term managers who weren't ready for change.
Additional roadblocks included:

- A lack of integrated architecture for data across the diverse systems landscape
- Redundant sources of reporting and analytics data
- Mistrust of their data resulting from a lack of data integrity and inaccuracies across different platforms
- Challenges with data analytics due to difficulty accessing applicable integrated data

This company had worked with Fusion for over a decade on several different initiatives, so when they needed a new modern data platform for financial services, they reached out to us.

Solution
After collaborating with the company leaders, we understood their goal (to better serve different client populations and prepare for changes in asset management) and how their current challenges were blocking their path forward. With that clarity, our team began a three-phased approach to meeting the objective.

Analysis
We began by assessing their data and analytics environment. This gave us a foundation for determining gaps and opportunities as well as for identifying and prioritizing business needs and drivers.

Program strategy alignment
Our team created a data charter that included a full inventory of their strategic business and use cases, an outline of their objectives and program goals, and a list of the identified challenges and barriers to meeting them. We used this to build a cohesive strategy to realign their data management and analytics capabilities with their business objectives, including:

- Prioritization of strategic business initiatives
- Key milestone targets for program increments
- Capability gaps to address
- A maturity assessment and program roadmap

With the assessment and strategy in place, we then worked with our client to understand where they wanted to be on the data maturity scale. From there, we could create an integrated, multi-year roadmap to address master data challenges, starting with the highest-priority tasks.
With a clear strategy in place, our data experts designed and built a modern data platform for the financial service company. The new platform would support data science, management reporting, and executive dashboards with integrated data that was accurate and reliable.
Can a bracelet really save your life? See how wearable devices resulted in improved care quality and patient outcomes for some of our most vulnerable populations.

Challenge
Nursing homes have consistently struggled with staffing ratios, and the problem has only worsened with the pandemic. Facing a caregiver shortage, nursing home administrators have had to find new ways to keep their patients, a vulnerable, elderly population, safe. That's where BioLink Systems came in. Nursing homes need to constantly monitor patient vitals to ensure they are providing appropriate care and can intervene as quickly as possible if there is an issue. But that can often be difficult with large patient loads and patients who sometimes cannot communicate their needs to staff. And with an elderly population, things can spiral out of control quickly if they aren't caught in time. By constantly monitoring a patient's vitals, nursing staff can address concerns quickly and mitigate potentially catastrophic events. BioLink Systems had initially created a device that could attach to an adult brief to monitor urination levels and the patient's body position. However, they quickly began experiencing issues with their prototype and realized that although it demonstrated their capabilities, it was not ready to take to production. They needed to fix these issues quickly, so they reached out to Fusion Alliance for assistance.

Solution
The first step was to fix the proof of concept (POC) to create a new demo to secure the necessary funding for the project. However, even after the initial fixes, BioLink realized they needed to start over. The POC had to be rebuilt, including not only the software, like web pages and portals, but also the hardware and firmware. Our solution was to architect a full-blown IoT solution from scratch. With the initial POC, all of the data was on-premises, but with the new architecture, we moved everything to the cloud.
We were able to ensure each patient had a unique identifier and that all data was encrypted, safe, and HIPAA compliant. In addition to re-creating their initial adult brief wearable, we created a wearable device to monitor patient vitals, including body temperature, oxygen levels, and heart rate.

Accurate Data in Real Time
Once we had a solution for collecting patient data via wearable devices, we needed to ensure it was transmitted to the right people at the right time. To do so, we created smart hubs spread around the facility to upload data to the cloud, along with web portals and a mobile application for caregiving staff. The web portals allow caregiving staff to check in at nursing stations and in offices, and the mobile application is something they can carry around to check on a patient immediately and provide real-time feedback when they are alerted to a patient concern. These executions allow data to be transferred from the cloud to the portals and mobile apps, where caregiving staff are notified with an alert for any patient that requires follow-up care.

Improving Patient Care & Outcomes
Each patient device is outfitted with an NFC chip, so once the nursing staff is alerted, they cannot dismiss the alert until they scan the chip on the wearable device and complete the appropriate assessment. If the alert is not dismissed in a timely manner, as defined by the nursing home administration, the alert is escalated to a new caregiver. This ensures that no alert is missed because a caregiver is attending to another patient or busy, and that interventions are completed in a timely manner.

Personalized Monitoring & Care Powered by Machine Learning
Now that we have this data from patients, we can use machine learning to gain insights about a patient's care and needs.
Initially, the patient care staff will set thresholds for appropriate vital levels, but as more data is collected, the system will learn what normal levels are for that specific patient and alert accordingly.

Data Security
The POC device was acting as a Bluetooth beacon, so as we evolved and proposed a new solution, security was at the center of the entire project. We needed to ensure not only that the data transmission was completely secure and HIPAA compliant, but also that the wearable device had significant battery life so that data was accurate and consistent. To ensure a long battery life, we created a wearable and smart hub that have no screens and communicate only via Bluetooth. You simply put the wearable device into pairing mode, and it exchanges the encryption key through the cloud. The encryption key is then sent to the smart hub, and all data is encrypted on the device and decrypted once it is in the cloud. In addition, each wearable device has its own dedicated encryption key, so if one wearable is compromised, BioLink can simply address the issue with that individual device without other wearables and data being compromised.
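The per-device key design described above can be sketched as a cloud-side registry that mints a fresh key at pairing time. The toy XOR keystream below uses only the standard library and exists purely to illustrate the isolation property; real BioLink devices would use proper authenticated encryption negotiated through the cloud, and every name here is an assumption.

```python
# Minimal sketch of the per-device key design described above. The XOR
# keystream is a toy, NOT production cryptography; it just illustrates why
# one compromised device key does not expose any other device's data.
import hashlib
import secrets

def _xor_stream(key: bytes, data: bytes) -> bytes:
    """Derive a keystream from the key and XOR it with the data.
    XOR-ing twice with the same keystream returns the original bytes."""
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(data):
        stream += hashlib.sha256(stream).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

class DeviceRegistry:
    """Cloud-side registry: every wearable gets its own dedicated key."""
    def __init__(self):
        self._keys = {}

    def pair(self, device_id: str) -> bytes:
        """Simulate pairing mode: mint and store a fresh key for the device."""
        key = secrets.token_bytes(32)
        self._keys[device_id] = key
        return key

    def decrypt(self, device_id: str, payload: bytes) -> bytes:
        return _xor_stream(self._keys[device_id], payload)

registry = DeviceRegistry()
key_a = registry.pair("wearable-a")
key_b = registry.pair("wearable-b")
reading = b'{"hr": 72, "spo2": 98}'
ciphertext = _xor_stream(key_a, reading)           # encrypted on the device
plain = registry.decrypt("wearable-a", ciphertext)  # cloud decrypts with a's key
print(plain)
```

Because decryption requires the specific device's key, revoking one compromised wearable leaves every other device's data untouched, which is the isolation property the paragraph above describes.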
Challenge
Our client, a large-scale automotive components manufacturer, implemented a real-time health feedback solution for vehicles using their components. The system used IoT engine and sub-system data points to notify the user of issues with the vehicle's components in real time, along with recommendations for correcting those issues. This system was built in Azure using Event Hub functionality and custom development. While the solution was successful, it was limited to reactively responding to warnings and errors as they occurred. The company needed an analytics solution that would create the functionality for developing predictive models providing meaningful, proactive recommendations to the fleets. To help them build a successful machine learning project, they chose to collaborate with New Era Technology.

Solution
To support the high-level analysis and machine learning required to build predictive models, the company needed a large amount of data collected over an extended time frame. Since the current system retained data only long enough to provide real-time responses, New Era needed to build a second big data solution to capture and retain the IoT data and integrate the manufacturing data our client would use in the analytics process. To complete this objective, we designed and implemented a data warehouse to collect and organize the data and provide a data platform for high-end analytics and machine learning.
In addition to creating this solution, we also worked alongside the client and:

- Developed an Azure and cloud solution architecture
- Identified recommended technologies and planned for implementation
- Designed and developed an integration model to ingest enterprise source data
- Built the Azure data pipelines to process the near real-time data from devices
- Integrated data into a curated data model for BI and analytics
- Demonstrated the BI capabilities to prove the business value of the data

Ultimately, we built a modern data platform that uses the existing Event Hub functionality to output the data, Azure Data Factory to manage the data flow, and the Snowflake Data Warehouse to store, organize, and integrate the data.
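A pipeline like the one above has to reshape raw Event Hub telemetry payloads into flat rows before loading them into the warehouse. The sketch below shows that flattening step; the payload shape, field names, and warning-code scheme are hypothetical, and the real flow ran through Azure Data Factory rather than standalone Python.

```python
# Illustrative sketch of the ingestion path described above: exploding one
# raw Event Hub telemetry event into one warehouse row per subsystem
# reading. Payload structure and warning codes are assumptions.
import json

def flatten_telemetry(event_json: str) -> list:
    """Flatten one engine telemetry event, carrying the vehicle and
    timestamp keys onto every subsystem row."""
    event = json.loads(event_json)
    return [
        {
            "vehicle_id": event["vehicle_id"],
            "recorded_at": event["ts"],
            "subsystem": name,
            "value": float(reading["value"]),
            "warning": reading.get("warning_code"),  # None when healthy
        }
        for name, reading in event["subsystems"].items()
    ]

rows = flatten_telemetry(json.dumps({
    "vehicle_id": "VIN-001", "ts": "2024-05-01T12:00:00Z",
    "subsystems": {
        "oil_pressure": {"value": "31.5"},
        "coolant_temp": {"value": "241.0", "warning_code": "W-OVERHEAT"},
    },
}))
print(len(rows), rows[1]["warning"])
```

Retaining these flattened rows over months, rather than discarding them after the real-time response, is what gives the predictive models the history they need.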
Challenge
Most organizations have a large volume of data available, including information about customers, products, services, and more. All this data can be incredibly useful, giving companies insight into how they can improve customer experiences or operational efficiency. However, without the right strategy and technology in place, all of that data may just sit there like a digital paperweight, providing little valuable insight for companies to act on. This was exactly where a large real estate investment firm found themselves. They wanted to gain more insight and knowledge from their data but were burdened by extensive manual data entry, data siloed in multiple locations, and delays in reporting. In addition to these already large challenges, their on-premises infrastructure was nearing data capacity. They knew they needed to take action, and leaders were ready to shift their data to the cloud. Ultimately, their goal was simple: enable corporate agility by implementing a platform that would allow them to gain valuable, actionable insights through data and analytics. They just needed the right tools and expertise to get there, and they turned to Fusion for assistance.

Solution
Cloud Solution
We started with an assessment of their existing system and data sources to determine what type of data platform would provide company stakeholders with better insights into their data. Our client was looking at different cloud options and wanted to ensure their platform decision would align with their existing systems while helping to improve their current processes. More than just a new implementation, they also needed the right data strategy in place to meet new business goals.
Based on interviews with stakeholders and the results of the platform evaluation, we ultimately recommended a hybrid cloud model that would work with their existing infrastructure, utilizing Azure as the cloud solution and Snowflake as the data platform. We created a roadmap that prioritized options for building out their data and technology strategy and a plan for getting from the current state to the updated future state.

Implementation & Proof of Concept
Armed with the recommendation and roadmap, this investment firm was ready for implementation. Our data and analytics team worked with them to:

- Build out the new data architecture, data model, and ingest processes
- Identify and integrate new data sources into their analytics environment
- Create and refine data governance processes and information

This process gave them the means to store and manage data effectively, and from there, our team helped them develop and implement dashboards and analytics reporting that would give them the insights they needed to make informed business decisions. Given its analytics and visualization capabilities, automation functions, and ease of customization, we knew that Power BI was the right tool for this client's analytics needs.

Continued Property Insights
Because this real estate investment firm focuses on shopping malls, data from their shoppers was critical. They wanted to track how well their properties were engaging with customers via different social media channels. We worked with their marketing department to build out a dashboard pulling data from Facebook, Instagram, Chatmeter, marketing email interactions, and other social media platforms to give them the detailed look they wanted.
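Behind a dashboard like the one described, the various channel feeds are typically normalized into one property-level rollup. The sketch below shows that aggregation on hand-written sample records; the channel names match the case, but the record shapes and numbers are assumptions, since each real API returns its own format.

```python
# Simplified sketch of the cross-channel rollup behind the dashboard
# described above: summing engagement per property while keeping a
# per-channel breakdown for drill-down. Record shapes are assumptions.
from collections import defaultdict

def engagement_by_property(events: list) -> dict:
    """Aggregate engagement records into a per-property summary."""
    summary = defaultdict(lambda: {"total": 0, "by_channel": defaultdict(int)})
    for e in events:
        prop = summary[e["property"]]
        prop["total"] += e["engagements"]
        prop["by_channel"][e["channel"]] += e["engagements"]
    return summary

events = [
    {"property": "Northtown Mall", "channel": "Facebook", "engagements": 340},
    {"property": "Northtown Mall", "channel": "Instagram", "engagements": 510},
    {"property": "Eastgate Mall", "channel": "Chatmeter", "engagements": 120},
]
rollup = engagement_by_property(events)
print(rollup["Northtown Mall"]["total"])
```

In the actual solution, the same aggregation would be modeled as Power BI measures over the ingested channel data rather than run in application code.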
Challenge
The ability to analyze valuable, relevant data quickly offers merchants, retailers, and service providers new insight into consumer behavior and opportunities for personalized experiences. A Fortune 500 card services provider served 135 individual private-label credit card merchants and needed a way to optimize how it retrieved and analyzed data so it could provide customized loyalty marketing services. However, leveraging their data for loyalty marketing was a challenge they didn't have the tools to meet. Their existing data management and analytics platform supported marketing and financial analytics, but they recognized a need to optimize these systems to align with their expanding opportunities for growth. The card services provider turned to Fusion Alliance for help in developing a strategic data management and analytics strategy, as well as a roadmap to guide their optimization journey.

Solution
When we started the collaboration with the card services provider, their partners included 135 different merchants with private-label credit cards. For each merchant, the company provided reports and analytics based on specific, unique requirements. They needed a data management strategy for their current resources that would maintain the level of customization their clients were used to while growing their brand-partner portfolio. To better understand their specific challenges and determine and implement the best possible solution, we completed a comprehensive evaluation of current business needs, requirements, and market opportunities. We also assessed the technical landscape, reviewed the client's current strategic plan, and collaborated with them to create a more effective data management and analytics strategy with a multi-year roadmap for implementation.
With the full support of our client's executive committee, Fusion began solution delivery, starting with creating an Enterprise Data Governance Council to oversee the data management and analytics program. Fusion applied our comprehensive, proprietary Catalyst Strategic Data Management framework to accelerate the process, which allowed us to complete the strategy and roadmap on a fast-paced schedule. This ensured our client could more effectively guide the development efforts needed to deliver analytic enablement in support of business objectives. After working with Fusion Alliance, the client gained a roadmap that outlined areas for improvement and strategies that would enable the company to offer more individualized services, including loyalty management for merchants, while reaching additional clients. The end result was a strategic data management and analytics strategy that would build out a better future.
Challenge A Fortune 500 company that provided credit card services and loyalty and marketing solutions to over 100 merchants and retailers wanted to design a virtual workspace and improve internal communications for its 8,000 employees using an intranet site. The company envisioned a user-friendly intranet site that would support efficient and secure workflows and facilitate collaboration, but the resulting site fell short of their vision. While intranet sites should help companies share information and improve communications, the reality is that many sites don't deliver the needed results, becoming a disorganized mess and a navigational nightmare due to a lack of governance. The card services company's employees found their site difficult to navigate, and administrative tasks were sluggish, leading to declining productivity. Having invested in an intranet that was not sufficiently meeting their employees' workload demands, the organization asked New Era Technology to build a site that was reliable, secure, and functional for the company. Solution The client wanted to build three different site templates: one each for teams, projects, and pages. New Era Technology looked at the client's objectives and the challenges standing in the way of meeting them, and stepped in with a plan in place. First, our team inventoried content in the existing SharePoint site to identify current usage patterns and ensure the new templates would accommodate them. We also conducted numerous stakeholder interviews to identify key content items necessary for the redesign. Using Agile and a proprietary project-management methodology called SureSolve, our team designed a new site for each of the templates and created a branded master page consistent with the organization’s style standards. Each site was specifically created to adapt, scale, and evolve with the changing demands of the employees, business, and industry.
Finally, we designed and launched multiple elements to streamline workflows and facilitate efficient document and project management, including:
- My Documents Web Part
- Governance plan
- Content guide
- Comprehensive site map
- Information architecture
We implemented the SharePoint intranet following Microsoft best practices, taking care to ensure that any customizations didn't hamper the client's ability to upgrade SharePoint or move to SharePoint Online in the future.
Challenge During the summer of 2020, government leaders needed to continually evaluate the economic impact the COVID-19 pandemic had on their communities. Having accurate, timely data was essential to making strategic decisions about how to help businesses, determine which businesses could open and which should stay remote, and decide how long shutdown protocols should remain in effect. Unfortunately, they didn't have access to the data they needed. Illinois-based economic development services firm Blane Canada, Ltd., partnered with the volunteer grassroots BR|E (business retention/expansion) COVID-19 Response Network to create a way to quickly provide data to government leaders. They created a benchmark survey and follow-up questionnaire consisting of carefully selected questions related to the workforce, finances, supply chain, and future needs, and used the responses to measure the level and severity of the economic impact and learn the needs of businesses. While they had a large volume of relevant, valuable data, they needed a data dashboard that would allow them to analyze and distribute data and share it freely with the public on a large scale. The group needed a technology partner who would take time to understand the problem, ask the right questions, and build a technology solution against a tight timeline. Eric Canada, CEO of Blane Canada, was confident that Fusion Alliance, his company’s technology partner of two years, would be the right fit and asked if our team could build a data dashboard for economic developers to learn the impact of the virus on their business communities. Within two weeks, we had a solution and dashboards up and running, available to the public. Solution Our team was aware that a solution needed to be launched quickly. We started by evaluating survey platforms and conducting proof-of-concept testing to see which platforms met all the requirements, and then selected the best option for our data dashboard.
With a platform in place, we could determine how to standardize and unify data collection to improve returns and gain more quantifiable data. Our team built the survey, sent it out, and shared it with other entities so they could distribute the survey to businesses in their areas. This all took place within five days of Blane Canada, Ltd. reaching out to Fusion. From there, we turned our focus to displaying the results. We recommended a data dashboard similar to the Johns Hopkins coronavirus dashboard, and our client agreed. A dashboard would allow users to see the story in a visual format and interact with the data. Working against the clock, we built the dashboard and demoed it a few days later to over 100 organizations in the growing grassroots volunteer network. Two days after the survey was sent, they began receiving data, which was then aggregated and put into the analytics toolset. It worked exactly as it should, and the group continues to send surveys and follow-up monitoring questionnaires nationwide, updating the data dashboard as more results come in. Many companies submitted four separate monitoring surveys, providing added data points that government entities and grassroots organizations could use.
Evolving demands require evolving strategies. Learn how one company transformed its customer experience by focusing on data. Challenge The ability to analyze quality data in a timely manner allows loyalty marketing companies to create strategic campaigns targeting new and existing cardholders. And as new technology continues to drive the market across industries, it is imperative that loyalty marketing companies implement data-driven solutions to manage the increasing volume of client demands. This Fortune 500 card services provider recognized the need to provide customized loyalty marketing services to its 135 private-label credit card merchants, and realized that the only way to do that was to optimize how it retrieved and analyzed data. Their existing data management and analytics platform supported marketing and financial analytics, but they needed to optimize these systems to align with their expanding business vision and strategy. Ultimately, they wanted to:
- Create a strategic data management program to drive data and analytics maturity
- Optimize teams to enable delivery of customized client solutions more efficiently
- Establish greater oversight of goals and results from investing in a strategic data management program
They knew where they wanted to go, but they needed help getting there. The organization turned to Fusion for help in developing a strategic data management and analytics strategy and a roadmap to guide the optimization journey. Solution At the time we got involved, the client’s brand partners included 135 different merchants with private-label credit cards. For each merchant, the company provided reports and analytics based on a unique set of requirements.
They saw that they needed a strategy to better manage the volume of reports with their current resources and still maintain the level of customization clients were accustomed to, especially as the brand-partner portfolio continued to grow. We completed a comprehensive evaluation to determine current business needs, requirements, and market opportunities. Then, we assessed the technical landscape and reviewed the client’s current strategic plan. The result was the creation of a data management and analytics strategy and a multi-year roadmap to help them achieve their goals. With the full support of the client’s executive committee, we began solution delivery, starting with the creation of an Enterprise Data Governance Council, appointed to oversee the Data Management and Analytics Program. The process was accelerated by applying Fusion’s comprehensive Catalyst Strategic Data Management framework. Through this approach, we completed the data management and analytics strategy and roadmap on a fast-paced schedule, allowing the client to more effectively guide development efforts needed to deliver analytic enablement in support of business objectives.
Challenge Schools provide help and resources to students who are known to be suffering from stress, abuse, and mental health concerns, but many children's issues go undetected. To prevent children from falling through the cracks and missing out on needed support, Terrace Metrics was founded in response to schools wanting to understand the behavioral-health status of their students, identify at-risk kids early on, and intervene to change the course of their lives. Utilizing proprietary algorithms, Terrace Metrics created a behavioral health tool that allowed schools to assess the mental health and well-being of their students in a non-threatening, non-invasive way. Schools adopted this tool rapidly, and Terrace Metrics realized that the tool's reliance on multiple third-party tools and manual steps prevented it from scaling with the pace of demand. While their main challenge was an inability to scale with demand, the startup tech solution also needed to:
- Automate manual processes
- Implement strong security features
- Automate report generation and distribution
- Provide schools with access to administrative tools to manage information
- Create scalability to handle increased usage and thousands of simultaneous assessment submissions
- Allow the addition of new features such as text-to-speech
- Ensure a low operating cost
They reached out to New Era Technology for design and technology expertise they could use to reimagine their behavioral health tool and expand nationally and internationally. Solution Creating a Roadmap to Success We worked with Terrace Metrics to understand their processes, goals, long-term strategy, and market demand.
It became clear that they needed a comprehensive technology solution that included:
- Creating a code repository with continuous integration and continuous deployment (CI/CD) capabilities
- Development and testing capabilities
- Production, monitoring, and support
- Updating the user interface (UI) to focus more on user experience (UX) and ensure an easy, intuitive experience for students, parents, teachers, and administrators
We built a comprehensive strategy to meet these objectives and revamp this startup tech solution. Cloud Application First, our team developed a cloud-based web application on the Amazon Web Services (AWS) cloud. This ecosystem offers the tools and services needed to overcome current challenges and support future functionality, including:
- Cognito for secure user authentication
- Lambda for managed serverless compute
- DynamoDB for a scalable managed database
- Polly for text-to-speech
We designed the application specifically as a multi-tenant SaaS application with an event-driven design and serverless architecture. This ensured faster time to market, lower costs, improved scalability, and greater resiliency. With no infrastructure administration and the ability to add more functions in the future, such as machine learning, we knew this would be the right option for their product. Improved User Experience This tool needed to be accessible to a variety of populations. We addressed UX for students, parents, administrators, and mental health professionals by offering several features, including text-to-speech and translating the tool into languages beyond English. With an extensible platform on AWS designed to accommodate changes and updates, we could continue assisting them in their expansion to international markets. We’ve worked with several software development companies in the past. None compare with New Era Technology in terms of quality of product, customer service, and willingness to go above and beyond to ensure that our needs are met.
We love New Era Technology and are so thankful that they are our partner.
Rich Gilman, President, Terrace Metrics
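The multi-tenant, serverless pattern described in this case can be sketched as a minimal submission handler. This is an illustrative sketch only, not Terrace Metrics' actual code; the table layout, key scheme, and field names are assumptions:

```python
import json
import uuid

def handle_submission(event, table):
    """Store one assessment submission.

    In AWS, `table` would be a DynamoDB Table resource, e.g.
    boto3.resource("dynamodb").Table("assessments"); here it is injected
    so the logic can be exercised without cloud infrastructure.
    """
    body = json.loads(event["body"])
    item = {
        # Partitioning by school is one common multi-tenant key scheme.
        "pk": f"SCHOOL#{body['school_id']}",
        "sk": f"SUBMISSION#{uuid.uuid4()}",
        "student_id": body["student_id"],
        "answers": body["answers"],
    }
    table.put_item(Item=item)
    return {"statusCode": 201, "body": json.dumps({"stored": True})}
```

In a deployment like the one described, a function of this shape would run in Lambda behind an API Gateway route, with Cognito validating the caller's identity and Polly invoked by a separate function for text-to-speech.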
Life-Saving Technology for Long-Term Care: How Digital Experience in Healthcare Tackled Problems for Vulnerable Patients
Challenge While it's common for patients with diabetes living in long-term care facilities to have standing orders for blood glucose monitoring every few hours, it's also common that the physician in charge does not review the results before a new test is conducted. When this occurs, insurance companies don't reimburse the care facility for the cost of the test. More importantly, not reviewing tests regularly meant that healthcare providers were missing diabetic episodes, resulting in adverse outcomes, including health complications or even deaths. When a healthcare technology solutions company realized the extent of this problem, they sought to develop a patient-event notification system that would automatically upload the standard diabetes test results and securely send them to the provider. If the provider did not respond within a set amount of time, the system would automatically follow an escalation protocol, sending the results to the next-level provider, and so on, until the issue was resolved. If the results revealed a diabetic event that needed immediate attention, the provider could create the necessary care plan to minimize negative outcomes. They also wanted to build in an option allowing a company to create a notification system that worked specifically with its own diabetes-care tools, strengthening customer loyalty to its specific product and brand. With a plan in mind, they reached out to Fusion Alliance to develop and launch this game-changing application. The greatest challenge of incorporating a digital experience in healthcare would be building this system within the security and privacy regulations. However, our team has in-depth experience working in regulated industries, broadly integrated technical proficiency, a clear understanding of security and usability issues, and the ability to meaningfully engage and wrap solutions around business processes.
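The escalation flow described above can be sketched in a few lines. This is a hypothetical illustration of the pattern, not the system's actual code; the provider roles, timeouts, and function names are invented for the example:

```python
from dataclasses import dataclass

# Hypothetical escalation chain: role names and timeouts are illustrative,
# not the client's actual configuration.
@dataclass
class Provider:
    name: str
    timeout_minutes: int

def escalate(chain, acknowledged_by=None):
    """Walk the escalation chain until a provider acknowledges the result.

    `acknowledged_by` simulates which provider (if any) responds in time;
    in a real system this would be driven by timers and secure messaging.
    Returns (providers notified, acknowledging provider or None).
    """
    notified = []
    for provider in chain:
        notified.append(provider.name)          # send results securely
        if provider.name == acknowledged_by:    # reviewed within the window
            return notified, provider.name
        # timeout elapsed with no response: escalate to the next level
    return notified, None                       # chain exhausted

chain = [Provider("attending nurse", 30),
         Provider("on-call physician", 30),
         Provider("medical director", 15)]
```

If no one in the chain responds, the unresolved result surfaces at the end of the chain, which is the failure mode the real system was built to prevent.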
Solution Our Fusion team knew this system would have to navigate complex relationships with patients and providers while complying with both industry regulations and patient confidentiality standards. We worked directly with the client to develop, pilot, and grow the diabetic-event notification system, taking care to keep patient privacy and HIPAA laws as a high priority. To do so, we created a high-security data center to ensure strong security throughout the portal. We also prioritized the user experience and functionality of the site because our team knew that if providers didn't find it usable, they wouldn't embrace the technology. We implemented a web-based caregiver interface with features that would support unified messaging so providers could receive messaging in a way that best suited them. The pilot program successfully determined that the new event-notification system would substantially assist in improving patient care, patient satisfaction, and quality of life, while increasing reimbursement rates and lowering operating costs for long-term care facilities. The company earned immediate success and brand loyalty due to its absolute focus on the customer.
Challenge Industry estimates suggest banks lose about 10 percent of their account deposits each year to customers closing their accounts. While half of that loss is attributed to factors out of the bank's control, such as death, divorce, or displacement, the other half is attributed to the customer's dissatisfaction with the bank's fees, rates, products, or lack of convenience. For a bank that handles nearly $700 million in deposits annually, losing $70 million to customer attrition would seriously curb its ability to grow. A regional bank wanted to reduce customer attrition and learn how to predict which customers were likely to close their checking accounts within 90 days. This would give them the opportunity to take action to retain the customer. Their traditional analytics tools didn't have the capacity to uncover patterns, and human analysis certainly wouldn't work for analyzing billions of data points in real time. The bank knew they could use machine learning to analyze their existing data to reveal trends and insights and predict future behaviors and outcomes, but they hadn't used it before. For expert help with this project, they turned to New Era Technology. Our team had worked with them several times in the past and knew how to help them get the insight they needed. Solution Most machine learning projects take several months before seeing results. However, our client wanted to see a proof of concept quickly to determine how this would work and whether it would help them fulfill their goals. With a New Era Technology Machine Learning Jumpstart, we could help them meet this objective. Machine Learning Jumpstart During our Machine Learning Jumpstart, we assess the key business problem, assess the data, and build a machine learning model that delivers answers fast. To do this, we follow a specific, four-step plan, beginning with use case identification.
Prior to beginning our technical work, we explored several use cases in a workshop with the bank’s business and technical stakeholders. Our team rated each potential use case on different criteria, including the complexity, availability, and value impact of the data. We agreed upon the deposit attrition proof of concept, deciding it would drive maximum predictive value with minimal risk. Data processing. We inventoried and sourced the existing data, then cleaned and loaded it into the target on-premises environment where the models would be developed. We provided the option of loading the models in the cloud to enable additional ML models and more complex computations. Machine learning model development. Within three weeks, we began engineering the machine learning models, choosing the subset of data most relevant to the question, “Which checking accounts are likely to close in the next 90 days?” We selected the machine learning algorithms, then trained and tuned the model. Model insights integration. With a model in place, we met with the bank’s stakeholders to present metrics for measuring the model’s success, focusing on KPIs. The model returned valuable insight into the accounts at high risk of closing, giving the bank an opportunity to refer those customers to its retention team. This end-to-end process generates daily predictions using real-time data in less than an hour on the bank's existing infrastructure. Educating the Team While providing the machine learning model was our highest priority, we also wanted to help educate the bank's team on key elements of machine learning, such as the process for training models and the process for generating predictions. This would occur organically by working side by side with their team. Optimizing Results The bank wanted to use machine learning to predict customer attrition more effectively than they could through traditional analytics.
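As a rough sketch of the kind of 90-day attrition model this step describes, here is a gradient-boosted classifier trained on synthetic account features. The feature names, algorithm choice, and data are assumptions for illustration; the bank's actual model and schema are not public:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Synthetic account features; names are illustrative, not the bank's schema.
balance_trend = rng.normal(0, 1, n)        # 90-day balance slope
txn_count = rng.poisson(20, n)             # monthly transaction count
tenure_years = rng.uniform(0, 30, n)
direct_deposit = rng.integers(0, 2, n)

# Toy label: falling balances, few transactions, and no direct deposit
# make an account more likely to close within 90 days.
logit = -1.5 - 1.2 * balance_trend - 0.05 * txn_count + 2.0 * (1 - direct_deposit)
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([balance_trend, txn_count, tenure_years, direct_deposit])
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]   # P(close within 90 days)

# Flag the top decile of scores for the retention team to review.
threshold = np.quantile(scores, 0.9)
at_risk = scores >= threshold
```

Scoring every account daily and handing the top-scored list to the retention team matches the referral workflow described above, whatever the real model looked like.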
They also wanted to understand the key metrics for evaluating machine learning models. In the secondary phase, we could optimize the model and measure its efficacy. This will allow the bank to optimize and expand the use case, or use the model as a template that can be modified for one of the other use cases. The entire proof of concept took eight weeks, and the bank now has machine learning models that can be implemented in marketing campaigns in the next phase.
Challenge Managing bank and credit union reserve cash is a complex exercise: manage it too tightly and your institution may be subject to high-interest Federal Reserve borrowing fees; manage it too loosely and your firm may lose out on substantial interest revenue from parked cash. A wholesale financial services provider served hundreds of credit unions nationwide and typically maintained a large volume of cash in reserve to account for member credit union activity. Because the credit unions conducted business autonomously, the organization was constantly challenged to predict members’ cash reserves without any direct control or visibility. However, technology could provide the visibility they needed. More specifically, they wanted to apply advanced analytics to predict member activity and drive better returns on reserve cash. To meet this goal, they partnered with New Era Technology to find a solution. Solution While the financial services provider could not directly influence credit union spending and borrowing, they possessed one critical asset: decades of financial transaction data to support the cash reserve engagement. Company leaders understood there were patterns in the member credit union data based on calendar milestones (payroll activity, mortgage pay activity, etc.) but needed help identifying these regularities in the noise across hundreds of credit unions and billions in cash. We proposed using machine learning to "read" 18 years of historical cash data to predict the next 60 business days of member activity, in aggregate and by cash account. This initiative would provide a discrete view for the investment desk to simulate cash and borrowing needs and effectively partner with finance. Our team developed a machine learning algorithm that would account for the entire body of transactions while still favoring more recent data.
To develop this algorithm, we landed and cleaned data in our client's Azure cloud platform and gathered success metrics on a variety of algorithms to achieve the desired liquidity aims for the organization. From there, we selected a long short-term memory (LSTM) recurrent neural network. After achieving the desired metrics for cash management, we moved to the next phase, where our team developed an analytical website solution that:
- Allowed the company’s finance team to feed in new data
- Exposed long-term liquidity analytics so the investment team could effectively manage bank cash in the big picture
- Secured the environment according to bank best practices
Once we completed this step, we developed a weekly retraining process to keep the LSTM models current and integrated the solution with a machine learning web service hosted in Azure.
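To make the "18 years of history, 60 business days ahead" framing concrete, here is one common way such a daily series is shaped into training examples for an LSTM forecaster. This is a generic sketch under assumed window sizes and synthetic data, not the client's actual pipeline:

```python
import numpy as np

def make_windows(series, lookback=90, horizon=60):
    """Slice a daily cash series into (lookback-day input, horizon-day
    target) pairs, the supervised shape a sequence forecaster trains on."""
    X, y = [], []
    for start in range(len(series) - lookback - horizon + 1):
        X.append(series[start : start + lookback])
        y.append(series[start + lookback : start + lookback + horizon])
    return np.asarray(X), np.asarray(y)

# Illustrative stand-in for one cash account's daily balance history
# (~18 years of business days: seasonal cycle plus noise).
days = np.arange(4500)
series = (1e6
          + 5e4 * np.sin(2 * np.pi * days / 21)
          + np.random.default_rng(1).normal(0, 1e4, days.size))

X, y = make_windows(series, lookback=90, horizon=60)
# Each row of X is 90 days of history; each row of y is the next 60 days.
```

One simple way to realize the "favor more recent data" behavior described above is to pass per-sample weights that grow with each window's recency when fitting the model.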
Data management in banking poses unique challenges. You’re dealing with vast amounts of sensitive information, rigid regulations, and security issues, all of which can complicate the process of actually managing and using the data you collect. Given our long experience with data management in financial services, we jumped at the chance to help a regional bank streamline their data strategies. After helping them transform their operations through an enterprise data management program, the client saw a staggering 1,054% ROI over three years. Challenge A regional bank’s need to prevent and reduce credit losses from defaulted commercial loans was symptomatic of a greater challenge. The bank needed a data management program that could help it more effectively manage different aspects of the business. Read on to learn how a new finance data strategy helped our client triumph over the core challenges of 1) meeting stringent regulatory demands for more robust reporting and 2) dealing with issues surrounding its data and data access. A Common Challenge in Banking Each year, banks approve billions of dollars in commercial loans. Throughout the approval process, documents are signed and covenants are created to ensure that funds will be repaid. Funds not repaid within the outlined term can result in higher capital requirements on the institution and, ultimately, credit losses for the bank. Most often, the only indicator that a loan has gone bad is when payments become delinquent, which is too late. Our client wanted to analyze such scenarios well in advance to prevent payment default. And that’s what led to the need for an improved bank data management program. Solution Our team’s assessment revealed that this bank’s ability to quickly uncover and manage credit loss was constrained by a lack of consistent, quality data, and by static reporting and manual processes. 
In addition to regulation issues, other issues to resolve included:
- Incomplete and inconsistent data
- A desire to have more time to analyze data before monthly, quarterly, and yearly reporting
- An inability to see the “story” behind the data
- An inability to interact with the data through visualization tools
Summary of Deliverables
- An Enterprise Data Management framework that included culture, people, process, and technology change management
- The enablement of a new data leader, i.e., a chief data officer
- Numerous executive dashboards:
  - Status dashboards: accrual, AQR status, charge-off and recovery, delinquency, and others
  - Trend dashboards: commercial portfolio, retail and mortgage portfolio, charge-offs, large dollar exposure, etc.
  - Operational dashboards: delinquencies and maturities
  - Alert dashboards: accruals, loan structure alerts, AQR alerts, etc.
Challenge As credit card data processing companies move toward commoditization and omnichannel processes, one of the nation's largest processors saw value beyond the initial transaction. Each debit or credit card transaction they processed, whether an authorization, a decline, or a settlement, carried dark data that offered significant value for merchants. However, that data, along with merchant, bank, and consumer information, was highly proprietary and included data that couldn't be used beyond determining what to authorize and settle. The processing company was at a crossroads. On one hand, they wanted to expand to other channels and create competitive product lines to keep ahead of trends. On the other hand, they were very concerned about risking the security and efficiency of their services. To meet their goals, the company leaders knew they would have to rethink the foundation of how they do business, including analyzing their capabilities and gaps and finding opportunities to access data more easily. They needed a data transformation strategy based on credit card data analytics, as this would allow them to move quickly and stay ahead of industry trends. It would also allow them to evaluate their existing technologies against newer, more nimble ones and manage risk while maintaining current service levels. While they had already made great strides in mapping out their plan for the future, they needed an experienced team of experts to guide their transformation in a way that would align with their business strategy. To meet these goals, they partnered with New Era Technology to create a digital strategy roadmap.
Solution Our team knew we needed to build an environment that could meet several demands, including:
- Processing and analyzing billions of data transactions
- Integrating with other internal and external datasets
- Interpreting consumer behavior
- Preserving the privacy, integrity, and quality of data
- Ensuring the resulting analytics and insights were valid
The first step to building this environment was identifying obstacles to managing and interpreting data. Next, we defined a framework for governance and controls that complied with strict regulations and would ensure all credit card transaction data and resulting analytics were kept confidential. Once we had the auditing process as the foundation of the digital strategy roadmap, we shaped and refined it into a three-year data and data transformation strategy that included:
- Creating the desired data management platform capabilities
- Creating a critical alignment between IT and the rest of the company
- Identifying which technologies to repurpose or replace
- Implementing new technologies to begin data integration and migration
- Implementing data governance and obfuscating data
- Identifying change agents and thought leaders who would propel the organization to success
With a data strategy roadmap in place, our client was able to get ahead of the evolving payment-processing market and future-proof their organization.
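Obfuscating card data can take several forms; a common pair of techniques is masking for display and keyed tokenization so analytics can join records without exposing the real card number. The sketch below is illustrative only; the key, field names, and token length are assumptions, not the client's actual controls:

```python
import hashlib
import hmac

# Illustrative key; a real deployment would fetch this from a secrets manager.
TOKEN_KEY = b"example-key-not-for-production"

def mask_pan(pan: str) -> str:
    """Show only the last four digits of a card number for display."""
    return "*" * (len(pan) - 4) + pan[-4:]

def tokenize_pan(pan: str) -> str:
    """Replace a card number with a stable, irreversible token (keyed HMAC)
    so analytics can group and join on it without seeing the real PAN."""
    return hmac.new(TOKEN_KEY, pan.encode(), hashlib.sha256).hexdigest()[:16]
```

Because the token is deterministic for a given key, the same card always maps to the same token, which preserves the ability to analyze cardholder behavior over time without handling raw PANs.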
Solution The bank’s leadership needed to understand who their ideal customer was, how to leverage their data, and how to strengthen their sales pipeline. Knowing they needed help from a team of experts, they reached out to Fusion to assess the situation and help them achieve their goals. We knew machine learning would be the best way to identify their ideal customer and find new opportunities in the market. This data science method analyzes historical data to predict or forecast future outcomes, behaviors, and trends. The models learn from the data without human bias, preconceived notions, or explicit instructions. Also, because the volume of data is often massive, machine learning can find patterns that a person would likely miss. With a plan in mind, we began the process of leveraging machine learning to increase their sales pipeline predictability and stability. Our Goal: Define the Ideal Customer First, we needed to define exactly who the stakeholders believed their ideal customer was and determine the attributes they identified with their target audience. This insight would allow us to select high-value use cases for the machine learning models. Our Solution:
- Sat down with stakeholders throughout each department of the organization to discuss characteristics of their ideal customer
- Brought stakeholders together in a workshop to get everyone on the same page as to how they would identify their ideal customer
- Selected high-value use cases
Our Goal: Understand Their Data Machine learning initiatives are only as successful as the quality of the data. We needed high-quality, valuable data that would support the questions we were asking and give us accurate, reliable predictions we could use. So, before we developed the models, we needed a solid foundation in place. Our Solution:
- Identified all available data and performed an analysis to assess data quality and completeness to support the defined objectives
- Identified where to find the best sources for data
- Analyzed information to determine where data needed to be improved and where gaps were located
Our Goal: Develop a Machine Learning Model With characteristics identified and clean data in hand, we developed a model that leverages the significant characteristics for use against prospects or existing customers. During this step, we allowed customer data to speak for itself. Our Solution:
- Identified the data elements that should be input to the machine learning model based on the data profiling
- Provisioned a cloud environment and developed data ingestion
- Defined and developed machine learning predictive models that supported the defined use cases
- Executed the model against real data and assimilated the output to graphically show customer segmentation
Our Goal: Finalize the Ideal Customer Definition We used stakeholder inputs, data profiling outputs, and machine learning to let data and actual outcomes influence the definition of the ideal customer. Our team explained what the model said, the characteristics of the target customer, and how that compared to the stakeholders' expectations. Using this information, we could create criteria and attributes the bank could use to build customer personas. Our Solution:
- Took data from the model to share insight into the true characteristics of their ideal customers
- Built machine learning models that would score customer and prospect lists against the customer model
- Operationalized the model for use in marketing campaigns
At the end of the project, we developed the prospect target list and scored it based on the machine learning insights. The company can use this to execute a more informed marketing plan and grow their customer base.
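A segmentation step like the one described might, for example, rest on a clustering model. The minimal k-means sketch below is illustrative; the feature names, cluster count, and synthetic data are invented for the example and are not the client's actual models or data:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)

# Synthetic customer features (names are illustrative assumptions):
# a large group of smaller customers and a smaller high-value group.
annual_revenue = np.concatenate([rng.normal(50, 10, 300),
                                 rng.normal(200, 30, 100)])
product_count = np.concatenate([rng.poisson(2, 300),
                                rng.poisson(6, 100)]).astype(float)

# Standardize so both features contribute comparably to distances.
X = StandardScaler().fit_transform(
    np.column_stack([annual_revenue, product_count]))

# Cluster customers into segments; the "ideal customer" segment can then
# be found by comparing segment centers against stakeholder criteria.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
segments = kmeans.labels_
```

Scoring a prospect list against such a model amounts to assigning each prospect to its nearest segment center and prioritizing those that land in the ideal-customer segment.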
Solution
When PFC needed a technology strategy for their banking company, they turned to Fusion Alliance: we have partnered several times and have developed a collaborative approach to fulfilling PFC's vision. Our expertise has led their company into several initiatives and adopted functionalities that have paid off immensely over the years, and PFC was excited to see how we could help with this next project.

Our team sat down with PFC and determined their goals: modernizing their platforms and applications and transitioning away from their legacy system. More specifically, because PFC manages an investment trading platform, they need a reliable platform with access to new products and tools that also complies with strict industry regulations. With objectives and a plan in mind, we quickly got to work.

Our Goal: Overhaul Existing Systems and Streamline Processes
To provide PFC with a clean slate, we developed a single platform with a more functional web interface to replace the legacy systems. This streamlined and automated numerous manual processes, instantly increasing productivity. We also helped PFC migrate to the cloud to eliminate on-premises servers, equipment, maintenance, and security overhead while strengthening disaster recovery through cloud redundancy.
Our Solution:
- Removed expensive, geo-redundant datacenters
- Enhanced efficiency, scalability, and reliability
- Created significant process improvements
- Boosted productivity
- Enriched client and employee experiences
- Increased brand confidence
- Widened market reach

Our Goal: Add New Product Lines
Our team understood that PFC wanted to expand product lines to increase revenue and better serve their clients. With a modern cloud-native platform, PFC could easily extend their product offerings, both now and in the future.
Our Solution:
- Updated code
- Created an extensible framework so PFC could launch new products on their own
- Implemented Microsoft Power BI visual analytics for improved reporting
- Updated security safeguards to protect data

Our Goal: Drive Sales Through Data
PFC wanted to draw more insights, including who does and does not purchase products when interacting with their site, and to learn how to improve sales targeting and identify more opportunities to increase sales.
Our Solution:
- Improved data access with a modern data platform
- Launched a machine learning initiative
- Tuned the machine learning model to better pinpoint investors who are more likely to purchase a specific product
Challenge
Primary Financial Company manages an investment program for institutional investors to invest substantial funds in federally insured CDs. This breaks down to managing nearly 40,000 CDs and over $7 billion in assets while supporting relationships with 5,000 financial institutions and institutional investors. They wanted to improve sales targeting to predict both CD issuers' funding needs and institutions' desire to invest.

As a massive company with a large share of the industry's market, PFC had a vast quantity of valuable data at their fingertips, but they didn't know how to leverage it to predict consumer behavior. Having partnered with Fusion on several projects in the past, they turned to us once again to explore how advanced analytics and machine learning in financial services could provide data-driven, predictive outcomes.

Solution
With a key objective in place, PFC and Fusion worked together to explore the following machine learning models:
- Identify the best issuers for sales solicitation, including former, current, and prospective issuers
- Provide rate guidance to investors and rate/term guidance for CD issuers
- Target investors by likelihood of close

With a plan in place, our machine learning experts outlined the process: data acquisition, transformation, model development, and predictive analytics. First, all private and public data sources were identified and acquired to gain insight into current and prospective customers. PFC and Fusion then worked together to determine meaningful, available factors. The next step was to transform the data so these factors would be consistent and accurate. With a solid foundation, Fusion developed machine learning models that would learn to identify patterns, recognize those patterns when seen again, and apply those lessons to predict outcomes.
We identified over 100 candidate "features" from both public and private data sources, then applied "practical analytics," which focuses on the data that is applicable to the described use case. Having a clear understanding of the target accuracy of the predictive models was essential to the success of this project, but client participation was even more important. By collaborating with our team, PFC learned to quickly identify and understand errant data and numbers, driving the value of the models even higher. And by defining the required utility of the models, PFC could realize business value without endlessly tuning them.
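Winnowing 100+ candidate features down to the ones that matter for the use case is the heart of the "practical analytics" step described above. The sketch below shows one common way such a pass can work — keeping only features whose correlation with the outcome clears a threshold. The feature names, data, and threshold are invented for illustration, and real feature selection for a project like this would use richer tests than plain correlation.

```python
# Illustrative sketch of winnowing candidate features to those that carry
# signal for the use case. Rows, features, and the 0.5 cutoff are assumptions.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy) if vx and vy else 0.0

# Each row: candidate feature values plus the outcome (1 = issuer raised funding).
rows = [
    {"assets": 1.0, "branches": 3, "random_tag": 1, "outcome": 1},
    {"assets": 0.4, "branches": 9, "random_tag": 1, "outcome": 0},
    {"assets": 0.9, "branches": 2, "random_tag": 0, "outcome": 1},
    {"assets": 0.3, "branches": 8, "random_tag": 0, "outcome": 0},
]
outcome = [r["outcome"] for r in rows]
candidates = ["assets", "branches", "random_tag"]

# Keep only features whose absolute correlation with the outcome clears the bar;
# uncorrelated noise like "random_tag" falls out.
kept = [f for f in candidates
        if abs(pearson([r[f] for r in rows], outcome)) >= 0.5]
```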
Challenge
Over the past several years, classrooms have shifted to a more individualized model of education to better meet each student's needs. While this offered McGraw-Hill an incredible opportunity to demonstrate its commitment to providing customers with state-of-the-art, adaptive learning systems, it also required them to divide their focus away from printed textbooks and materials and redefine themselves to deliver more digital content and systems that allow personalized learning for students.

However, developing, delivering, and maintaining these digital educational programs is time-consuming and costly, and at the time they wanted to break into this market, most of their resources were still devoted to printed textbooks and content. They needed a framework that would let them continually grow and build out their digital content without overextending their budget. At the same time, they already had established processes and operations in place to maintain the success of their printed content and didn't want to take away from this essential part of their company.

Moving to a digital focus while continuing to deliver high-quality content required McGraw-Hill to improve their processes and products around advanced technology and increase efficiency when building applications and digital content. They knew partnering with a company experienced in developing technology solutions was key, and that's how the long, collaborative relationship between MHE and our Fusion team began.

Solution
McGraw-Hill turned to Fusion for thought leadership, strategic vision, and our team of application developers, scrum masters/project managers, quality assurance resources, and technical product managers.
We initially sat down with them to determine three key objectives:
- Improve processes and products by employing advanced technology
- Implement scalable frameworks
- Increase efficiency when building applications and content

To meet these objectives, we started by leading the architecture and business-case discussion for deploying cloud-native application services that would serve over 4 million students. This project's success led McGraw-Hill to expand their application services across their digital platform group. At the same time, Fusion introduced Agile development principles and the Scrum framework so they could successfully scale their classroom technology solutions.

Our team helped McGraw-Hill continue to grow their footprint in the digital space by assisting them in developing and delivering several products to support their classroom technology endeavors. We also worked with them to build on existing processes and infrastructure, saving time and reducing costs compared to building from scratch. Because of the continued partnership with Fusion, McGraw-Hill has quickly developed products that offer strategic value in learning-science initiatives, and they are now poised to compete on a new level beyond printed content and textbooks. Their foresight into digital education enabled them to stand out in a changing market as they continue to partner with Fusion on learning-science initiatives.
Challenges
Standing Out in a Saturated Market
National pizza brands have massive marketing budgets and immediate brand recognition. Donatos needed to stand out from those brands, but they didn't have a data-driven strategy for differentiating from competitors. Their message and marketing got lost in the noise of competition, often costing them returning customers.

Converting New Customers to Returning Customers
Donatos wanted to identify customers who were at risk of leaving so they could take action to win them back. Without the ability to derive analytics from their data and gain insight into consumer behavior, strategic planning was unsuccessful.

Achieving Company-wide Growth
With over 160 stores, Donatos Pizza had many that were succeeding, but weak customer retention meant that many stores were struggling. The leaders needed a solution that would increase sales in these stores and achieve consistent, company-wide growth.

Solutions
Donatos Pizza wanted data solutions that would lead to increased customer retention. Through a prior statistical analysis, company leaders knew that customers who returned within a specific time frame of their prior visit were likely to become long-term, loyal customers; those who didn't were unlikely to return at all. They wanted to identify at-risk customers and target them with strategies to regain their business and loyalty.

Fusion Alliance sat down with Donatos to discuss their goals and determine the right course of action. Because they had a vast quantity of customer data readily available, we knew machine learning would offer the best opportunity to identify patterns and predict consumer behavior. Fusion Alliance created a three-month pilot program to implement and apply machine learning models in specific stores. We predicted that by the end of the program, the stores using machine learning would retain 30 percent of the identified at-risk customers.
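The retention insight above — customers who don't return within a specific window rarely return at all — translates directly into a labeling rule for flagging at-risk customers from visit history. The sketch below is a minimal illustration of that rule; the 30-day window, customer IDs, and dates are assumptions, since the case study does not state the actual time frame.

```python
# Hypothetical sketch: flag at-risk customers from last-visit dates.
# The 30-day return window and the sample records are assumed for illustration.
from datetime import date, timedelta

RETURN_WINDOW = timedelta(days=30)  # assumed "specific time frame"

last_visit = {
    "cust_1": date(2024, 5, 1),
    "cust_2": date(2024, 5, 28),
    "cust_3": date(2024, 4, 10),
}

def at_risk(last_visits, today, window=RETURN_WINDOW):
    """Customers whose most recent visit is older than the return window."""
    return sorted(c for c, d in last_visits.items() if today - d > window)

flagged = at_risk(last_visit, today=date(2024, 6, 1))
```

A rule like this typically supplies the training labels; the machine learning model then learns which behaviors predict that a customer is about to cross the window.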
Goal: Identify the Algorithm That Would Produce the Most Accurate Predictive Analysis
The value of the results is directly dependent on the value of the data the platform receives. With so much data available, we needed to be very specific about what we used and which algorithms we identified as the most accurate.
Our solution:
- Assessed the quality and quantity of the data
- Developed predictive models
- Cleaned the data to use as a training set and used it to identify the best algorithm for accurate predictive models

Goal: Accelerate the Process While Reducing Costs
Machine learning and predictive analytics are often expensive and risky. Fusion minimizes risk and cost through a use-case-driven strategy: we start with the use case, pull in only the necessary data, and take an iterative approach to deliver a working solution that brings more value to the client on a faster timeline.
Our solution:
- Started with the use case: identifying at-risk customers
- Performed an ETL (extract, transform, and load) of their data to a cloud platform to improve efficiency
- Evaluated and selected the data most likely to provide accurate insight, including pulling in source data, aggregating it, and removing aberrations

Goal: Home in on a Specific Question for the Machine Learning Models to Identify At-risk Customers
Machine learning offers the best outcomes when there is a specific, defined problem to solve.
Our solution:
- Formulated and tested questions
- Iterated over the model to achieve the desired state
- Ran the previous day's sales against the model to produce a daily list of at-risk customers
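The algorithm-selection step — identify which candidate model predicts most accurately — usually comes down to comparing candidates on data held out from training. The sketch below shows that comparison in miniature; the tiny dataset and the two threshold "models" are invented stand-ins for real trained algorithms, and a production version would use proper cross-validation rather than a single holdout set.

```python
# Sketch of algorithm selection: keep whichever candidate model predicts
# best on held-out data. Dataset and threshold models are assumptions.

holdout = [(8, 0), (45, 1), (33, 1), (10, 0)]  # (days_since_last_order, churned)

def make_threshold_model(threshold):
    # Predict "at risk" (1) when days since last order exceeds the threshold.
    return lambda days: 1 if days > threshold else 0

def accuracy(model, data):
    """Fraction of held-out examples the model classifies correctly."""
    return sum(model(x) == y for x, y in data) / len(data)

# Stand-ins for two trained candidate algorithms.
candidates = {
    "strict_60d": make_threshold_model(60),
    "window_30d": make_threshold_model(30),
}

best_name = max(candidates, key=lambda n: accuracy(candidates[n], holdout))
```

The same selection loop works unchanged if the candidates are, say, a logistic regression and a gradient-boosted tree instead of hand-built thresholds.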