Articles about AI & Machine Learning
Collaborate and manage your workflows smarter, faster, and better with Smartsheet. There are many project management tools to dive into, and it can be a little overwhelming to figure out which tool, if any, will really get the job done. While Smartsheet is spreadsheet-based, it is a powerful, full-featured project management tool that can deliver almost immediate ROI. Smartsheet offers many capabilities that help ensure projects are successful and efficient from end to end. Let's dive into the capabilities that can drive success in your next project.

Integration and Collaboration

With many teams still working remotely, a productive collaboration and project management tool is key to completing tasks and milestones efficiently throughout a project. One of the primary benefits Smartsheet offers is the ability to share sheets with internal or external team members. Even users who don't hold a license can still collaborate, which is a cost-effective approach when you have multiple projects running simultaneously with many decision-makers who need access. Administrators can also control access levels, a critical feature when many people are involved but not everyone should have edit permissions on shared sheets. Smartsheet also offers connectors to key tools users already work with, such as Microsoft Dynamics 365, Salesforce, and Jira. For example, with the Microsoft Dynamics 365 connector, users can create workflows and apply filters that control which rows sync to objects in Dynamics 365, for instance syncing only rows with a particular status or value. Workflows can also sync automatically to keep records organized and up to date.
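The status-based row filtering described above can be sketched in a few lines. This is an illustrative sketch only, not the connector's actual API; the field names and status values are hypothetical.

```python
# Illustrative sketch of connector-style row filtering: only rows whose
# status matches the workflow's filter are selected for syncing to the
# external system. Field names and statuses here are hypothetical, not
# part of any Smartsheet connector API.

def rows_to_sync(rows, status_field="Status", allowed=("Approved", "In Review")):
    """Return only the rows whose status qualifies them for syncing."""
    return [row for row in rows if row.get(status_field) in allowed]

rows = [
    {"Task": "Draft budget", "Status": "Approved"},
    {"Task": "Vendor quotes", "Status": "Backlog"},
    {"Task": "Kickoff deck", "Status": "In Review"},
]

for row in rows_to_sync(rows):
    print(row["Task"])  # prints "Draft budget" then "Kickoff deck"
```

The same filter-then-sync pattern is what keeps an external system limited to only the records that are actually relevant to it.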
As connectors are put in place, it's critical to review user permissions to ensure syncing works properly and that users save their changes so only the latest version appears on shared sheets. See an issue? A record history tracks every change and the user associated with it, so it's easy to identify the source of any errors. A number of other tools allow users to collaborate even further and connect existing databases, CRMs, and ERPs through Smartsheet's Data Shuttle. Data Shuttle is an add-on that automatically imports data from external software systems and databases directly into Smartsheet, such as actual hours worked from a time-tracking tool or expenses from an accounting system.

Data Analytics and Reporting

Smartsheet's utilization reporting is another benefit for managing all of the resources involved in projects. A key factor in project management is gathering information about inefficiencies and who is doing what on each project. Key stakeholders almost always want to know how much time a project will take and how long each step is taking along the way; Smartsheet takes the guesswork out of tracking hours and forecasting. Gone are the days when key stakeholders made decisions based on opinion alone. Data should back every decision in order to move forward successfully. Ask a project manager to estimate how long it will take their team to complete a project, and they will likely underestimate the actual time required, which leads to frustration and a prolonged time to market. Once that real-time data is collected in Smartsheet, visual dashboards can be created to show a project's performance across various metrics, tailored to whoever is viewing the data.
Custom reports can be created to surface the critical data points each group in an organization needs. Data visualization makes it easy to identify issues early, take preventive measures, and translate data into a language different groups can readily understand.

Improve Operational Efficiencies

Given the benefits above, improved and streamlined operations are almost a given. With the data analytics and resource tracking a project management tool like this provides, operational efficiencies emerge organically, as long as the data being collected is acted upon. The insights are, in a way, priceless: the reporting shows where to pivot to support business growth and helps avoid issues that would never have been visible without bringing all of this data together on one shared sheet or dashboard. Clearly seeing inefficiencies in a project allows stakeholders to make critical decisions quickly and decrease time to market, instead of prolonging projects and missing deadlines. A comprehensive project management tool should be seen not as a luxury but as a necessity, especially at a time when many teams are still working remotely and you need to track how critical resources spend their time and how long tasks take. PMI's Pulse of the Profession® survey revealed that an average of 11.4 percent of investment is wasted due to poor project performance. And organizations that undervalue project management as a strategic competency for driving change report an average of 67 percent more of their projects failing outright.
A project management tool is the solution to poor project performance and management, and organizations can't treat these valuable tools as optional: they are critical assets that keep the business healthy rather than on a fast track to failure. Many organizations now run lean, with people wearing more than one hat and a whole portfolio of projects in flight at any given time. Smartsheet not only gives insights to key stakeholders, it gives your teams the tools to manage their own time and tasks without micromanagement. Smartsheet is a dynamic tool that helps organizations across industries increase output and work more efficiently, both internally and externally.

Multi-layered Security

It's always great to know that a software tool you're planning to use has access to all of your organization's databases and lets them talk to each other seamlessly, right? Well, for some, especially IT, that may raise red flags about the risk of a data breach. With Smartsheet, security is built into the product: the organization's data is protected with multi-layered access permissions and quarterly access audits. Smartsheet also has policies and procedures in place to back up an organization's data in multiple locations, with a team constantly evaluating security threats and implementing countermeasures to help prevent unauthorized access or downtime. As part of its multi-layered data security strategy, Smartsheet also encrypts data with NIST-approved ciphers. According to Statista, there were just over 1,000 data breaches and millions of records exposed in the U.S. last year. When evaluating a tool with access to this much data, an organization should ensure its data protection strategy is nothing less than the gold standard. Not sure where to start?
A Smartsheet platinum partner like New Era Technology can guide your organization through implementing this powerful tool across the business and remove any apprehension that comes with taking that first step. The only regret might be that the tool wasn't implemented from day one.
Predictive models tell you more than any tarot reading

"Call me now for your free tarot reading!" If you were ever up late watching television in the late '90s, you remember Miss Cleo, the woman who claimed she could see the future and tell your fortune with an over-the-phone tarot reading. You probably read that opening line in her dramatic Jamaican accent and pictured her with her crystal ball. Of course, it was "for entertainment only," but that didn't stop thousands of people from calling in, hoping for a glimpse of the future to help them make tough decisions. Wouldn't it be great if you could see what the future holds for your business, so you could make smart decisions and build strategies that support upcoming trends and mitigate risk? But if you can't call Miss Cleo, what can you do? You look into the past. And no, we're not talking about holding a séance. We're talking about leveraging your data with machine learning. Machine learning analyzes your data for trends and patterns that can be used to build predictive models that forecast economic trends, customer behavior, and marketing campaign success.

Learn More: Modern data platforms support machine learning to develop predictive models

Get Smart: Want to learn more about the rise and fall of the self-proclaimed psychic? Check out Call Me Miss Cleo, a new documentary on HBO Max that provides an illuminating, in-depth look at the woman, the empire that was built around her, and her eventual downfall.
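No crystal ball required: at its simplest, "looking into the past" means fitting a model to historical data and projecting it forward. The sketch below fits a straight-line trend to six months of made-up revenue figures; real predictive models use far richer features and algorithms, but the principle is the same.

```python
# A minimal forecasting sketch: fit a least-squares linear trend to
# historical monthly revenue and project the next month. The revenue
# numbers are invented for illustration only.

def fit_trend(values):
    """Ordinary least-squares fit of values against time steps 0..n-1.
    Returns (slope, intercept)."""
    n = len(values)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(values) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, values))
             / sum((x - x_mean) ** 2 for x in xs))
    return slope, y_mean - slope * x_mean

def forecast(values, steps_ahead=1):
    """Project the fitted trend steps_ahead periods past the last observation."""
    slope, intercept = fit_trend(values)
    return intercept + slope * (len(values) - 1 + steps_ahead)

monthly_revenue = [102, 108, 113, 121, 127, 134]  # six months of history
print(round(forecast(monthly_revenue), 1))  # projected revenue for month seven
```

The same fit-then-extrapolate pattern underlies forecasts of customer behavior or campaign performance; production models simply swap the straight line for something that captures seasonality and more variables.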
There's a rapidly growing space in tech that doesn't have a name, but you know it when you see it. We call it "tech nobody asked for." Tech nobody asked for is typically modern technology, such as a mobile app, device, or software, that doesn't solve a problem or improve an experience. Often, it creates additional steps or complications. Here are a few examples:

- An internet-connected hairbrush that "listens" to hair and recommends products
- A Bluetooth water bottle that syncs to a "hydration app"
- A smart egg tray that syncs to your phone to tell you how many eggs you have and how fresh they are

These may be extreme examples, but there are near-infinite others. Have you been to a restaurant that got rid of printed menus and now requires you to scan a QR code or download a mobile app to see the menu? No one wants a QR code menu. Your business wants to solve customer problems and offer the best experience. Providing a mobile app or other new tech solution seems like the right answer, but you want to avoid investing in and launching tech nobody asked for. That's where our technology and marketing teams can help. We identify customer concerns, including obstacles in the buyer journey and gaps in customer experience, and help you build tech solutions to overcome those issues, including mobile applications and website improvements. Check out this case study to see how Fusion did exactly that. >>

If you want to learn more about custom, customer-friendly technology solutions, we can help you get started. Want to see more examples of tech nobody asked for? MIT Technology Review released its picks for the 10 worst technologies of the 21st century. Is there anything on the list you disagree with?
When it comes to understanding how wearables are changing healthcare, consumer brands serve as a solid leading indicator. Popularized by brands like Apple Watch, Fitbit, and Garmin, the global wearable healthcare market was estimated at $16.2 billion in 2021 and is projected to double in the next five years.

Healthcare wearables in daily life

Although most users rely on healthcare wearables to check texts during spin class or crush their friends' daily step records, an increasing number rely on smartwatches and other medical wearables for life-saving medical information. As the technology evolves, healthcare wearables can now give minute-by-minute EKG readings, monitor blood sugar, check oxygenation levels, and help people use real-time data to manage their health while they go about their regular activities. Wearables also deliver oversight and peace of mind to caregivers, as when diabetic children wear devices that monitor insulin and food intake and link to mobile apps monitored by their parents. These breakthroughs allow patients of all ages more autonomy while reassuring caregivers that the person is safe.

Learn more about what wearable devices make possible >>

Healthcare wearables in long-term care settings

Long-term care presents a gap between that at-home monitoring scenario and the tech-saturated acute care space of a hospital or clinic. Historically understaffed, nursing homes and long-term care facilities struggle with high turnover, increasing rates of preventable errors, and unnecessary escalation of avoidable medical events. In addition to the impact on patients and their loved ones, these realities affect the facility itself through lower reimbursement rates and increased cost of care. A 2021 survey on staffing in these facilities by the American Health Care Association and the National Center for Assisted Living showed that 99% of nursing homes and 96% of assisted-living facilities face a staffing shortage.
Harvard University professor David Grabowski says the pandemic only worsened that already critical situation. He notes, "We've overlooked and undervalued this workforce for a long time and now we're at a full-blown crisis…We're in a crisis on top of a crisis." Ensuring the right level of care for high-risk and elderly patients amidst staffing constraints formed a critical use case for transformation. Healthcare wearables emerged as a leading option to give staff the ability to monitor more patients, receive notifications when care is needed, and escalate when necessary.

Overcoming obstacles to adoption

Implementing a program for wearable devices in nursing homes introduced more stringent requirements than consumer wearables, including:

- Privacy protection: Patient medical information, covered under HIPAA, requires more protection than off-the-shelf iOS and Android systems offer.
- Usability concerns: Patients in nursing homes and long-term care facilities often lack experience with technology and/or the dexterity to manage new devices.
- Cost considerations: In addition to the cost of patient wearables and the devices nursing staff use to monitor and communicate alerts, facilities must also invest in secure data infrastructure and information architecture beyond the standard integrations in market-ready smartwatches.

Creating a targeted solution

Realizing that nursing homes and long-term care facilities faced unique barriers to implementing wearable devices, BioLink Systems set out to create a solution. Initially, the company devised a device that could be attached to an adult brief to monitor urination levels and body position. However, early issues with the prototype limited production scalability. Fusion worked with BioLink to architect a cloud-based IoT solution that uses machine learning to exceed the company's initial vision.
Designed with a minimalist aesthetic and user experience to fit the target demographic, the BioLink bracelet and adult brief wearables:

- Meet HIPAA requirements
- Monitor patient fluids
- Track patient vital signs
- Alert nursing staff when patient vitals fall outside their customized range
- Escalate alerts if patients are not attended to within an allotted timeframe

Initial testing and rollouts in nursing homes delivered immediate results, including:

- Improved patient care
- Decreased response times
- Fewer avoidable events, such as medication errors
- Decreased escalation of care level, including hospitalizations
- Improved oversight
- Increased compliance with state, federal, and agency regulations
- Better experiences for patients and their loved ones

Learn more about how BioLink's wearables are changing healthcare >>

What's next for wearable healthcare devices

As facilities gather more data from these devices, the machine learning algorithm BioLink and Fusion designed will continue to refine the unique vital sign ranges for each patient, resulting in more targeted care. Future iterations of the BioLink device will integrate that information with the patient's electronic medical record, enabling further customization of care. While each device starts with a baseline for normal for each vital sign, the more data that is collected, the better the facility can care for the patient. For example, if a patient's oxygen level is consistently high, the device eventually establishes a new threshold for that patient's vitals and sends notifications accordingly. Many elderly patients in particular can't communicate what they need or how they feel, and the possibilities for providing better care in those circumstances are endless. With options like dehydration sensors, nursing staff can not only bring water to patients but also ensure that they are actually drinking it.
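The per-patient threshold idea can be illustrated with a simplified sketch (this is not BioLink's actual algorithm): compare each new reading to the patient's own recent baseline rather than to a one-size-fits-all range.

```python
# Simplified sketch of a personalized vital-sign alert: a reading only
# triggers an alert when it falls well outside this patient's own
# recent history. Not BioLink's actual algorithm; values are invented.

from statistics import mean, stdev

def personalized_alert(history, new_reading, k=3.0):
    """Alert when new_reading is more than k standard deviations
    from the patient's own recent baseline."""
    baseline, spread = mean(history), stdev(history)
    return abs(new_reading - baseline) > k * spread

# A patient whose oxygen saturation normally runs between 98 and 99:
spo2_history = [98, 99, 98, 99, 98, 99, 98, 99]

print(personalized_alert(spo2_history, 97))  # False: normal for this patient
print(personalized_alert(spo2_history, 94))  # True: well outside their baseline
```

As more readings accumulate, the baseline and spread update, so the alert range adapts to each patient over time instead of staying pinned to a population-wide default.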
The more variables, the better

Ultimately, the more variables, the better the information, resulting in better care and better outcomes. Correlating and combining all of a patient's data can detect changes and allow for more timely, preventative care. And the more information included, the better the insights from the algorithm. With the right information, staff can head off medical events by predicting problems, and eventually develop better remedies and treatments that avoid costly medical interventions or catastrophic incidents. As industry leaders and healthcare facilities see the impact of devices like BioLink's bracelets, we expect to see greater adoption of healthcare wearables to elevate patient care, reduce facility costs, and find operational efficiencies even during times of staffing crises. With the right technology and innovation, we can change outcomes and save lives, all with a wristband.
As the pace of technological change continues to increase, digital transformation in healthcare often struggles to keep up. Challenges like integrating aging legacy systems, maintaining patient privacy, and turning disparate data sources into actionable insights loom large in healthcare, where time and resources are often at a premium. But the same circumstances that make digital transformation in healthcare more difficult are the very things that underline its importance. When patient lives are on the line, digital transformation isn't just a "nice to have." Healthcare systems that achieve their digital transformation goals see immediate improvements in patient experience, quality of care, and patient outcomes. From that standpoint, digital transformation in healthcare isn't just about adding technology; it's about revolutionizing the processes and systems that drive the health and well-being of the population as a whole.

Case study: Life-saving technology in diabetes long-term care >>

Putting patients first

While individual healthcare providers commonly put their patients' needs front and center, the system as a whole did not evolve with that mentality. Due to a variety of factors, including payer systems, consolidation, and the regulatory environment, healthcare systems earned a reputation for siloed information, duplicate workflows, lack of clarity, and confusion. As healthcare organizations seek to modernize, smart health systems are taking a consumer-centric approach: redesigning patient experiences and pathways while improving care delivery and outcomes with digital technology.

Article: Transforming customer engagement in the digital age >>

Planning the future of digital transformation in healthcare

During the pandemic, industries accelerated digital transformation efforts across the board, and healthcare was no exception.
Out of necessity, more medical touchpoints and interactions moved online, from virtual office visits to automated triage to digital paperwork. Now, two years into the new normal, healthcare organizations are taking stock of their progress, appreciating the speed and scale of their efforts, and mapping opportunities for the future. A recent Deloitte study found that 60% of health systems say they are about halfway through their digital transformation journey. In our experience working with technology innovators and leaders across industries, this is where things can get messy. Digital transformation is a long game, and organizations often get bogged down at the halfway mark. To keep moving forward and avoid costly wrong turns, healthcare leaders need a fresh vision and a renewed roadmap. Evolving digital transformation in healthcare to meet the changing expectations of patients and providers requires a commitment to a digital-first, people-centric approach, but it offers great opportunities for continued growth in connection, innovation, and successful outcomes. Based on our experience, we see five key areas where focused efforts can deliver outsized returns for healthcare systems that are midway through their digital transformations.

1. Modernize legacy systems to give providers and patients more options

While the vast majority of individual healthcare providers and healthcare organizations use an electronic health records (EHR) system, relatively few integrate seamlessly with patient portals. A recent Pew health information technology (HIT) survey found that almost 80% of respondents wanted to access and view their electronic health records through a website, an online portal, a mobile app, or electronically in some other way. The same survey highlights a strong desire for doctors to share information about the patient's health status. For most healthcare organizations, integrating patient records across practices and within portals is a headache at best.
Adding the other digital interactions that today's consumers expect, such as automated appointment and prescription workflows, chatbots, pre-filled forms, and instant answers, might seem impossible. Delivering a better patient experience and giving providers greater flexibility with their tools often takes a more strategic view. Rather than layering in more and more technology solutions, smart healthcare organizations take a holistic approach to modernization, creating flexible, modular solutions that give patients and providers more options in the near term while also making future enhancements easier.

Case Study: How an AI healthcare company optimized its digital experience >>
Article: Modernization challenges and the path forward >>

2. Mitigate risk to build patient trust

In addition to technology lag, healthcare systems struggle to connect patient health information due to regulatory constraints. To maintain HIPAA compliance in the US and GDPR compliance for EU patients, healthcare organizations sometimes limit the very information sharing that would result in higher-quality care. To meet patient expectations of data privacy and personal health data security while also delivering on modern expectations for functionality and connectivity, health organizations need to build best practices for security and governance into their technology architecture. While there are myriad ways to approach this issue, a few key options deserve consideration:

BYOD Policies

A 2019 study found that 63% of healthcare organizations sustained a security incident related to unmanaged and IoT devices. Given the rapid acceleration of digital transformation in healthcare since 2020, we suspect that number is much higher today. As healthcare organizations modernize systems and integrate more virtual and IoT solutions into their technology spaces, a robust, up-to-date BYOD policy becomes more important.
Developing a compliant, enforceable strategy is a critical step in your modernization efforts.

Case study: Navigating BYOD in a highly regulated industry >>

Containerization

One way to mitigate risk is to containerize data, workflows, and applications in the cloud. Although the cloud can sometimes get a bad rap for security, a carefully designed strategy puts security first and can prevent any breach from spilling over too far into other parts of your architecture.

Article: Maintaining a composable enterprise >>

Blockchain

Best known in the context of cryptocurrency, blockchain uses a computerized database of transactions to allow secure information exchange without the need for a third party. Applying blockchain technology to the healthcare industry could improve information security management; healthcare data can be communicated and analyzed while preserving privacy and security. Countries like Australia and the UK have started experimenting with blockchain technology to manage medical records and transactions among patients, healthcare providers, and insurance companies. In both cases, decentralized networks of computers handle the blockchain and simultaneously register every transaction to detect conflicting information, keeping records accurate and making them more difficult to hack.

Article: Building trust in your data privacy compliance >>

3. Use voice and wearables to enhance patient experience and outcomes

Wearable devices and IoT-based health sensors can track a patient's conditions and activities remotely, from vital signs and hydration to the onset of a medical crisis event. The data collected can help healthcare providers better guide patient care. Providers use IoT and wearable data for remote monitoring and preventative care, providing more specific, personalized connections even with lower staff coverage.
Machine learning also drives AI-based natural language processing technology in the healthcare space. As more patients become familiar with voice assistants like Alexa, Siri, and Google Home, healthcare organizations see potential to deploy the technology for tasks like triage and treatment reminders. For example, the UK's NHS uses voice technology to field common questions, deliver health information, and remind patients to take medication.

Case study: Using wearables to improve patient care >>

4. Put data to work for predictive and preventative care

Healthcare organizations collect volumes of data but traditionally haven't used advanced analytics to translate that information into actionable insights. Today's leading provider systems are exploring how real-time business analytics, predictive analytics, and AI can transform the patient experience and how care is delivered. In much the same way that businesses use data analysis to spot trends, forecast consumer behavior, and drive purchasing decisions, healthcare organizations can use the information they collect to understand patient expectations, discover areas of dissatisfaction or waste, and identify opportunities to enhance patients' overall experience with their facilities. Likewise, providers can use patient data to understand how a unique individual responds to treatment, spot key diagnostic markers, and even predict potential outcomes so that doctors and patients can work together to minimize risk.

Article: Data analytics in healthcare settings >>

5. Automate administrative tasks to focus on patient care

The growing number of administrative tasks imposed on physicians, their practices, and, by extension, their patients adds unnecessary costs to the healthcare system. Excessive administrative tasks also divert time and focus away from providing actual care to patients.
Tools like robotic process automation (RPA) can help healthcare systems save time and resources in areas such as administration, billing, and human resources, freeing up more time for face-to-face interaction with patients. When finding the right applications for automation in healthcare, it's important to keep the patient experience at the center of your strategy. A customer-first automation strategy can strike the right blend of automated and human interactions, meeting today's expectations and delighting patients rather than frustrating them.

Article: Finding the right use cases for automation >>

Evolving patient care through digital transformation in healthcare

As the digital tools, apps, and resources pioneered during the pandemic continue to evolve, healthcare leaders must keep pushing ahead with digital-first, patient-centric investments in technology, integrations, and solutions. Finding the right balance between patient and provider expectations, maintaining compliance, and enhancing patient care requires a mindset that values the patient's perspective. Ready to take the next step?

Get a machine learning jumpstart >>
Get a better view of your data analytics maturity >>
Refresh your digital transformation roadmap >>

Wherever you are on your digital transformation journey, our team of digital, data, and technology experts can help.

Ask us your questions about digital transformation in healthcare >>
The pace of change and unpredictable circumstances of the past couple of years have led many companies to rethink their just-in-time approaches to resourcing tangible goods and materials. But why stop there? To scale and adapt fast, companies also need a new approach to how they resource skillsets. One of our clients, PRECISIONxtract, did just that. By taking a just-in-time approach to their shifting skillset needs, the company was able to scale up fast, and minimize risk, in a changing business environment.

A right-fit-first approach

PRECISIONxtract's transformative healthcare market access solutions offer patients and providers unprecedented connection to the right medication and resources in clinical settings. To bring that vision to life, PRECISION could have found a series of single-skill vendors or taken the time to recruit and onboard new employees. Instead, they looked for a cross-functional partner that would fit seamlessly with their company culture and bring the right mix of scalable skills. They found that fit with Fusion Alliance. Fusion quickly became an integral part of PRECISION's team, assembling a group of more than 20 strategy, data, and technology experts to deliver responsive support for a growing set of initiatives.

Boosting surge capacity across disciplines

Knowing that their flagship product, Access Genius, needed design and functionality upgrades, PRECISION called on Fusion to assess and modernize the application without disrupting the existing business. To avoid downtime and increase speed to market, our team used an Agile process and model-driven design, in which models from the source code informed modernization efforts. Streamlining the overall architecture not only saved development time but also made Access Genius easier to deploy to PRECISION's clients.
And, to make the product easier to maintain and cheaper to run, we applied containerization through a microservices model and moved Access Genius to a distributed cloud hosting framework. Our solution delivered real-time customer insights across a variety of digital channels, in lieu of a people-driven process. This helped take Access Genius:

- From a complex, cumbersome legacy monolith to a lightning-fast, distributed, cost-effective, cloud-native solution
- From a user-driven, database-centric format to a distributed, API-based framework, enabling immediate data updates for important cost and coverage changes
- From a time-intensive customer engagement portal to an intuitive, streamlined, automated process

Equipped with a modern, stable, extensible platform, PRECISION was free to explore opportunities for more radical innovation.

Disrupting the market with frictionless access to timely data

Although Access Genius successfully broke down barriers with data, the solution's interface required users to navigate a complex dashboard with manual clicks and drop-downs. For pharma teams with limited time to connect doctors to information, seconds count. Working with PRECISION's product team, Fusion technology experts analyzed the friction of manual navigation and explored ways to make Access Genius more seamless for the user. Drawing on deep expertise deploying cutting-edge technologies in highly regulated spaces, Fusion suggested exploring a shift from a traditional web-based interface to AI-enabled voice functionality that would connect users to the most relevant data and messaging right in the flow of conversation.

Changing the way pharma enablement tools go to market

At the same time, other Fusion consultants were hard at work rethinking the way PRECISION's products reached, empowered, and retained customers.
We brought in a range of specialists to bring new strategies to life, including:

- Instructional designers and training developers created an interactive training platform to equip pharma sales reps with greater confidence in provider interactions by deepening their understanding of the Access Genius tool. RESULT: Access Genius IQ, a new training tool that helps PRECISION customers see faster ROI for their Access Genius investment
- Brand experts, visual designers, content strategists, and web developers elevated visual brand elements and created websites, editorial content, and outreach campaigns. RESULT: New website architecture, design, and content; long-form lead generation content; prospect cultivation email marketing
- Digital marketing strategists, creative designers, and ad teams implemented innovative ad campaigns in rapid succession as PRECISION had more time to develop and roll out new products. RESULT: LinkedIn ad campaigns generating 3X leads, including 100 qualified leads in the first 90 days

Read more about the success of Fusion’s marketing partnership with PRECISION >>

Reimagining the skillset supply chain

Partnering with Fusion gives PRECISION access to a huge team of experienced consultants with a wide range of skillsets — allowing the company to surge and scale as their business needs and market realities shift. With Fusion bringing in the right people at just the right time, PRECISION saves valuable time and resources, enabling them to be more innovative, more agile, and more impactful for their customers, healthcare providers, and patients.

Ready to explore how Fusion skillsets can help your team succeed? Our ongoing work with PRECISIONxtract is just one example of how we help companies build momentum for a digital-first world.
We bring big-picture thinkers, technology-minded creatives, data scientists, and technical experts to work alongside our clients, providing a force-multiplying effect that leads to scalable, future-focused solutions for the most complex challenges. Ready to get started? Let’s talk.
While artificial intelligence (AI) continues to linger in the popular imagination in the form of humanoid robots, in real life AI more often exists as a process enabler. Over the past several years, as falling costs democratized the technology, AI and related emerging technologies like machine learning (ML) and deep learning (DL) became more accessible to mid-market companies. Today, most businesses use AI in one capacity or another — streamlining work, minimizing risk, and gaining competitive insights. These innovations are more than buzzwords. They have powerful potential to revolutionize the way your business collects, processes, and acts on data to solve the real problems facing your business.

AI, ML, and DL in the business context

To find the right AI applications for your business, it helps to understand your options.

Artificial Intelligence
- Definition: Machines programmed to be “smart”
- Common examples: Smartphones, chatbots, virtual assistants
- Example use case: Configuring a CMS to deliver personalized website experiences using available data points
- Limitations: The machine can only act on the specific rules provided

Machine Learning
- Definition: Machines that learn from experience provided by data and algorithms
- Common examples: Spam filters, online purchasing recommendations
- Example use case: Discovering patterns in data such as “customers who buy X also buy Y,” purchasing cart analysis
- Limitations: Humans must input data parameters as a starting point

Deep Learning
- Definition: ML applied to larger data sets, using multi-layered artificial neural networks
- Common examples: Alexa, Google Translate, facial recognition, self-driving cars
- Example use case: Processing a large volume of unstructured data, such as images or voice recordings, to generate insights
- Limitations: Requires very powerful – and expensive – computational resources

How machine learning differs from AI

“ML is the science of getting computers to act without being explicitly programmed.” — Stanford University

Machine learning takes a different approach to developing artificial intelligence.
Instead of hand-coding a specific set of rules to accomplish a particular task, ML trains the machine using large amounts of data and algorithms that give it the ability to learn how to perform the task. Over the years, algorithmic approaches within ML have included decision tree learning, inductive logic programming, linear and logistic regression, clustering, reinforcement learning, and Bayesian networks. Currently, machine learning uses three general models:

- Supervised learning: Humans supply labeled examples until the machine can accurately apply the distinctions (for example, defining what counts as spam to a filter).
- Unsupervised learning: The system trains itself on the provided data to surface unknown patterns, as in clustering and association. Clustering looks for patterns of demographics in data and how they predict one another, as in targeting groups of customers with products they will likely need. Association uncovers rules that describe data, as in online book or movie recommendations based on previous purchases and purchasing-cart predictions.
- Reinforcement learning: Using complex algorithms, the system learns through trial and error toward a defined “reward” of success. Cycling quickly through mistakes or near misses, the machine adjusts the weight of the previous results against the desired outcome.

How deep learning works

As another method of statistical learning that extracts features or attributes from raw data sets, deep learning builds on ML frameworks. While ML requires humans to provide the desired features manually, DL uses even more complex algorithms and achieves more sophisticated results without that human input. Deep learning algorithms automatically extract features for classification. This ability requires a huge amount of data to train the algorithms and ensure accurate results. To process this volume of data, DL requires specially designed, usually cloud-based computers with high-performance CPUs or GPUs.
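The supervised model above can be sketched in plain Python: a toy spam filter that “learns” word counts from labeled messages instead of following hand-coded rules. The training data and scoring scheme here are invented for illustration; a real filter would use a probabilistic model such as naive Bayes.

```python
from collections import Counter

def train(messages):
    """Learn per-label word counts from (text, label) training pairs."""
    counts = {"spam": Counter(), "ham": Counter()}
    for text, label in messages:
        counts[label].update(text.lower().split())
    return counts

def classify(counts, text):
    """Label new text by which class its words were seen under more often."""
    words = text.lower().split()
    spam_score = sum(counts["spam"][w] for w in words)
    ham_score = sum(counts["ham"][w] for w in words)
    return "spam" if spam_score > ham_score else "ham"

# Hypothetical labeled training messages.
training = [
    ("win a free prize now", "spam"),
    ("free offer claim your prize", "spam"),
    ("meeting moved to tuesday", "ham"),
    ("lunch on tuesday with the team", "ham"),
]
model = train(training)
```

The point is the division of labor: humans supply the labels, and the distinctions are derived from data rather than written as rules.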
Using multi-layered artificial neural networks inspired by the biology of the human brain — specifically the organic interconnections between neurons — deep learning trains artificial neurons to identify patterns in information to produce the desired output. Unlike the human brain, artificial neural networks operate via discrete layers, connections, and directions of data propagation. Three common types of artificial neural networks and DL processing applications are:

- Convolutional neural networks (CNN) are deep artificial neural networks that are used to classify images, cluster them by similarity, and perform object recognition. These algorithms navigate self-driving cars and enable facial recognition, but are also used in leading-edge medical applications such as identifying tumor types.
- Generative adversarial networks (GAN) are composed of two neural networks: a generative network and a discriminative network. While GANs can be used negatively, as in the creation of “deep fake” photos and video, organizations can also use GANs to create privacy-safe data pools for ML.
- Natural language processing (NLP) is the ability to analyze, understand, and generate human language, whether text or speech. Alexa, Siri, Cortana, and Google Assistant all use NLP engines, and many businesses are exploring ways to incorporate voice into their proprietary applications and digital solutions.

Make smart decisions about AI

New Era Technology provides cloud infrastructure and emerging technology solutions that accelerate your digital transformation. Our teams help businesses across a wide variety of industries uncover the best use cases for AI, and the right emerging technology solutions to meet your goals. We can help you source, clean, and integrate your data, build and train machine learning models, and iteratively test and improve your solution to maximize results. Not sure how this might work for your business?
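To see why the layers described above matter, here is a minimal sketch of a two-layer network computing XOR, a function no single neuron can represent. The weights are set by hand purely for illustration; in deep learning they would be learned from data.

```python
def step(x):
    """Threshold activation: the neuron fires (1) when its weighted input exceeds 0."""
    return 1 if x > 0 else 0

def forward(x1, x2):
    """Forward pass through a tiny two-layer network computing XOR.

    A single neuron can only draw one straight boundary, so it cannot
    separate XOR's outputs; the hidden layer makes it possible.
    """
    h_or = step(x1 + x2 - 0.5)       # hidden neuron: fires if either input is on
    h_and = step(x1 + x2 - 1.5)      # hidden neuron: fires only if both are on
    return step(h_or - h_and - 0.5)  # output neuron: OR but not AND, i.e. XOR
```

Real networks differ only in scale: millions of such units, weights tuned by training rather than set by hand.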
Check out these real-world examples:

- Find out how machine learning helps a national pizza chain retain customers >>
- Discover how AI transforms business processes >>
- Explore the future of wearables and mobile ML technology >>
- Learn how ML can help businesses predict sales pipelines >>
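As a coda to the concepts above, the “customers who buy X also buy Y” use case from the comparison table can be sketched as a simple co-occurrence count. The cart data and support threshold are invented for the example; production market-basket analysis uses algorithms such as Apriori.

```python
from collections import Counter
from itertools import combinations

def association_pairs(carts, min_support=2):
    """Count how often item pairs appear together, keeping frequent pairs."""
    pair_counts = Counter()
    for cart in carts:
        for pair in combinations(sorted(set(cart)), 2):
            pair_counts[pair] += 1
    return {pair: n for pair, n in pair_counts.items() if n >= min_support}

# Hypothetical shopping carts.
carts = [
    ["printer", "ink", "paper"],
    ["printer", "ink"],
    ["paper", "pens"],
    ["printer", "ink", "pens"],
]
rules = association_pairs(carts)  # ("ink", "printer") co-occur in 3 carts
```

No one told the program that ink and printers go together; the pattern surfaced from the data, which is the essence of unsupervised association.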
Are you using coconuts? Make your quest more efficient.

In the classic film Monty Python and the Holy Grail, viewers hear King Arthur and his trusty servant Patsy approaching with a trademark “clip-clop, clip-clop” sound. When the duo emerges from the primordial mist, you see (spoiler alert) that the source of all this noise is not, as might be supposed, a horse. Rather, Patsy is banging two coconut shells together as the king trots about on his own two legs. The duo is getting from point A to point B in their quest, but not in the most efficient or effective way possible.

So many companies follow that script. Equipped with buzzword mandates like process optimization and data-driven decision making, it’s all too easy to make small adjustments that sound like you’re headed in the right direction but aren’t necessarily getting you there any faster. How do you drop the coconuts and get on the horse (metaphorically speaking)? What does it look like to use data to drive optimization in real terms? We’ve got our eye on digital twins.

Before you run away (how’s that for a deep cut Monty Python reference?) from yet another data buzzword, it’s worth another look at this practical application of machine learning and data analytics. Digital twins are most often used to optimize physical assets and processes like manufacturing, warehousing, and logistics. Using sensors to collect data on a product, machine, or physical process, the digital twin feeds real-time data to a machine learning algorithm to test variables and scenarios faster – ultimately leading to actionable process improvement insights. These days, we’re starting to see more businesses use digital twin frameworks to optimize and innovate non-physical business processes like accounting, HR, and marketing as well.
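As a minimal illustration of the pattern (not any particular product), a digital twin can be sketched as an object that mirrors sensor readings and answers what-if questions without touching the physical asset. The linear load-to-temperature model below is an invented assumption, standing in for a trained ML model.

```python
class MachineTwin:
    """Toy digital twin of a machine: mirrors sensor readings and tests
    scenarios offline. The linear heat model is illustrative, not real physics."""

    def __init__(self, temp_limit=90.0):
        self.temp_limit = temp_limit
        self.readings = []  # (load_pct, temp_c) sensor samples

    def ingest(self, load_pct, temp_c):
        """Mirror a reading streamed from the physical machine's sensors."""
        self.readings.append((load_pct, temp_c))

    def degrees_per_load(self):
        """Estimate how temperature scales with load from observed data."""
        return sum(t / l for l, t in self.readings) / len(self.readings)

    def safe_at(self, load_pct):
        """Scenario test: would this load keep temperature under the limit?"""
        return self.degrees_per_load() * load_pct <= self.temp_limit

twin = MachineTwin()
twin.ingest(50, 40.0)  # at 50% load the sensor read 40 °C
twin.ingest(60, 48.0)
```

The value is in the `safe_at` call: the scenario runs against the model, not the machine, so you can probe failure conditions without risking downtime.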
A digital twin simulation can help you surface interdependencies and inefficiencies that might otherwise be blind spots, especially if they’re baked into your business culture as “the way we’ve always done it.” In the quest for digital transformation, don’t settle for coconuts. Instead, let’s talk about the ways your data can carry more of the weight for you.

Get smart: If all this talk of Monty Python and the Holy Grail puts you in the mood for an old-school movie night, good news: it’s available on Netflix. And if you’re looking for a more literary scratch for your Middle Ages (ish) itch, we’re reading Cathedral by Ben Hopkins. It’s a fascinating look at the complex processes involved in constructing architectural marvels in the days before edge computing. We may handle optimization differently now, but human nature stays the same.
Almost 20 years ago, Capital One recognized the need for one person to oversee their data security, quality, and privacy, and the role of the Chief Data Officer was born. Now reports show that 68% of organizations have a CDO (Harvard Business Review, 2020). And while the role has become more common and has significantly evolved, many data executives are still struggling to get a seat at the table or bring data to the forefront of their organization. In fact, in a recent survey, only 28% of respondents agreed that the role was successful and established. Company leaders agree that there needs to be a single point of accountability to manage the various dimensions of data inside and outside of the enterprise, including the quality and availability of that data. But now we are at a crossroads — what is the best way to align the work that the CDO does with the strategy of the business as a whole? The reality is that CDOs often struggle to find the internal and external support and resources needed to educate others to align with the organization’s goals. Implementing enterprise data governance, data architecture, data asset development, data science, and advanced analytics capabilities — such as machine learning and video analytics — at scale, is not an easy task. To be successful, data executives need support, resources, and communities focused on the elevation of data. We are proud to continue to help these communities come to life for the benefit of our colleagues and clients, establishing local resources here in the Midwest with global scale, reach, and impact. Read on as Mark Johnson, our Executive Leader for Data Management and Analytics, provides insight on the current state of data and the CDO, and provides details on multiple opportunities for data leaders of different levels to get more involved in the data community.

Q: How has the role of data changed/evolved for organizations?

The reality is that information is everything.
This global pandemic proved that to many organizations. For some, it showed that their digital network was ready, and they were aptly prepared to take on COVID. For others, it has forced them to recognize their own immaturity with data and analytics. On its own, managing data is not exciting — the information just sort of exists. To give data value, you have to put it to use. And so, I think we are going to see the Chief Data Officer and/or Chief Data Analytics Officer really come into their own in the coming years. It’s time for their seat at the table. The C-suite is now asking questions that can only be answered with data, and now they truly understand both the value and consequences of the data game.

Q: What do you think are the biggest challenges facing CDOs/data leaders today?

I think that the biggest challenge for data executives today is the acquisition of talent that is seasoned and experienced where you need them to be for your organization. Higher education hasn’t necessarily kept up with the data world, and oftentimes it takes additional training to reach the right levels. The reality is that right now the talent is manufactured in the real world. Data executives have to be connected and equipped to mentor, train, and keep the right people.

Q: You’ve mentioned that data leaders need to connect with each other. What value can people expect from these data communities?

I think there is tremendous value. As we are seeing the power of data evolve in organizations, and the role of data leaders evolve as well, I think coming together to collaborate and share elevates the leader, the organization, and the view of data as a whole. These communities give people a safe space to talk about how they are doing, what they are doing, what their biggest challenges are, and what solutions are working for them. These communities have truly become both a learning laboratory and an accelerator for data.
Q: As a big proponent of connecting data leaders, you have been involved in creating different opportunities for people to get together. What groups/events would you recommend, and how can people get involved?

I personally have been involved with the MIT Chief Data Officer and Information Quality Symposium (MIT CDOIQ), which is such a great opportunity to start with for connection. It has developed into additional opportunities for data leaders at all levels to get involved and create the kind of community we need to truly elevate the value of data. Organizations like CDO Magazine, the creation of CDO roundtables across the nation, and the International Society of Chief Data Officers (isCDO) all evolved from connecting data leaders and identifying common challenges.

MIT CDOIQ

The International MIT Chief Data Officer and Information Quality Symposium (MIT CDOIQ) is one of the key events for sharing and exchanging cutting-edge ideas and creating a space for discussion between data executives across industries. While resolving data issues at the Department of Defense, the symposium founder, Dr. Wang, recognized the need to bring data people together. Now in its 15th year, MIT CDOIQ is a premier event designed to advance knowledge, accelerate the adoption of the role of the Chief Data Officer, and change how data is leveraged in organizations across industries and geographies. Fusion has been a sponsor of this symposium for seven years now, and we are so excited to see how the event has grown. Designed for the CDO or top data executive in your organization, this is a space to really connect with other top industry leaders.

CDO Roundtables

Fusion has always been focused on building community and connecting people.
And when one of our clients, a Fortune 500 retailer, mentioned wanting to talk with other data leaders from similar corporations, we realized that there was a big gap here — there was no space that existed where data leaders could informally come together, without sales pitches and vendor influence, and simply talk. That’s how the CDO roundtables were born — a place that allows data leaders to get to know each other, collaborate, accelerate knowledge growth, and problem solve. We started just two years ago in Cincinnati, but we’ve already expanded to multiple markets including Indianapolis, Columbus, Cleveland, Chicago, and Miami. These groups are designed for your CDO/CDAO and truly create an environment for unfiltered peer-to-peer discussion that helps solve data leadership challenges across industries. If you’re interested in joining one of these roundtables or starting one in your market, email me or message me on LinkedIn. I’m here and ready to get these roundtables started with executives in as many communities as I can. The more communities we have, the more data leaders and organizations we can serve.

International Society of Chief Data Officers (isCDO)

Launched out of the MIT CDOIQ symposium, the isCDO is a vendor-neutral organization designed to promote data leadership. I am excited to be a founding member of this organization, along with our Vice President of Strategy, David Levine. Our ultimate goal is to create a space that serves as a peer-advisory resource and enables enterprises to truly realize the value of data-driven decision making. With multiple membership options available, isCDO is the perfect opportunity for data leaders looking to connect with their peers and gain a competitive advantage by focusing on high-quality data and analytics.

CDO Magazine

I am really proud to be a founder of CDO Magazine, as it really is a resource for all business leaders, not just the CDO.
We designed the magazine to be a resource for C-suite leaders — to educate and inform on the value proposition, strategies, and best practices that optimize long-term business value from investments in enterprise data management and analytics capabilities. Check out the publication here. And if you’re interested in contributing content or being interviewed, let me know at email@example.com.

Closing

The role of the CDO is integral to organizations, but it’s still evolving. Now more than ever, it is important that data leaders come together to collaborate and problem-solve. Fusion is excited to be a part of each of these initiatives, and we are committed to being an agent of change in the communities we serve and beyond. By connecting global thought leaders, we believe organizations will realize the value of data to power their digital transformation. If you’re interested in joining any of these data communities or just have questions, feel free to reach out to Mark via email or on LinkedIn.
In a few short years, hyperautomation, or intelligent automation, has gone from a relatively unknown term to a word used across the technology spectrum. Gartner’s Strategic Technology Trends for 2020 named hyperautomation the #1 strategic technology trend for the year. Gartner also forecasted that the hyperautomation software market will reach nearly $600 billion by 2022. What’s fueling the investment? Organizations are trying to remain competitive by decreasing costs and increasing productivity. A focus on hyperautomation can address business challenges and improve operational efficiency, not to mention elevating the customer experience. “Hyperautomation has shifted from an option to a condition of survival,” said Fabrizio Biscotti, research vice president at Gartner, in a recent press release. “Organizations will require more IT and business process automation as they are forced to accelerate digital transformation plans in a post-COVID-19, digital-first world.”

The foundation of hyperautomation

With Robotic Process Automation (RPA) at its core, hyperautomation incorporates advanced technologies — including artificial intelligence (AI), machine learning (ML), natural language processing, optical character recognition (OCR), process mining, and others — to not only automate tasks typically completed by humans but also to build intelligence into the processes, as well as the information derived from those processes. By building on RPA, hyperautomation elevates workflow automation to make decisions previously made by people. It augments the power and value of what RPA provides with a proven path to applying AI to improve business operations.

Hyperautomation and digital transformation

Because of the level of automation that can be achieved, hyperautomation is commonly referred to as the next major phase of digital transformation. And it’s an intricate process.
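As a toy sketch of that idea, the fragment below pairs an extraction step (a stand-in for OCR) with a routing rule that auto-approves routine invoices, the kind of decision hyperautomation shifts from a human queue to software. All field names and thresholds here are hypothetical, not any vendor's API.

```python
def extract_fields(document):
    """Stand-in for an OCR/extraction step. A real pipeline would call an
    OCR or document-understanding service; here `document` is already a dict."""
    return document

def route(invoice, auto_approve_limit=500.0):
    """Decision layer on top of extraction: approve routine invoices
    automatically, escalate everything else to a human reviewer."""
    fields = extract_fields(invoice)
    if fields["vendor_known"] and fields["amount"] <= auto_approve_limit:
        return "auto-approve"
    return "human-review"

print(route({"vendor_known": True, "amount": 120.0}))
print(route({"vendor_known": False, "amount": 120.0}))
```

In practice the hard-coded rule would be replaced or tuned by an ML model trained on past approval decisions, which is what distinguishes hyperautomation from plain RPA.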
Organizations must implement automation simultaneously on multiple fronts to reach the end goal of hyperautomation. They often need to partner with digital innovation advisors and technology consultants to create a hyperautomation strategy from top to bottom and take all of the organization’s nuances into account. To achieve scalability, disparate automation technologies must work together. Careful planning, implementation, and improvement of processes are accomplished through intelligent business process management (BPM). BPM is a core component of hyperautomation and supports long-term sustainability and operational excellence. The combination of BPM solutions with low-code, RPA, AI, and ML has become a driving force for digital transformations, integrating essential data, connecting your workforce, and developing applications. It is up to technology leaders to create a clear strategy, set objectives, and prioritize actions across all business operations. Doing so ensures that automation is applied efficiently. Employees on the front lines are also in an excellent position to identify which processes would benefit most from automation. This can be supported by implementing a demand management solution, which can then be synchronized with the organization’s change management to ensure employees understand the changes and are prepared for more advanced processes, thus elevating the workforce. Organizations may be wary of the costs of change on such a large scale, but the process of integrating technologies does not always require creating a new infrastructure to replace manual operations. Many RPA, AI, and ML solutions can be integrated into automation and technologies that already exist.

The future of hyperautomation

The next generation of hyperautomation includes support for more complex processes and long-running workflows.
Software robots will be able to interact with business users across core business functions, directly impacting the customer experience. Hyperautomation represents the next step in intelligent automation and will transform how we work in the future. It allows businesses to protect their investments through a holistic approach to digital transformation. As hyperautomation becomes more prevalent, we will see a seamless blend of robotics, human employees, and existing systems, all working collaboratively in a way never seen before. No matter your industry, hyperautomation is worth consideration for its potential cost savings, intelligent processing, intelligence mining, employee efficiencies, and customer service improvements. Learn more about how hyperautomation technologies like ML and AI can benefit you.
Advances in mobile health technology have transformed the entire landscape of healthcare, including the ability of physician groups, employers, nursing home facilities, and pharmaceutical companies to capture data in the healthcare space. Gone are the days of paper tracking for your glucose levels and blood pressure. Instead, wearable devices like watches and trackers can seamlessly provide real-time data streams to applications and third parties. The data collected from wearables can be used for clinical research, patient monitoring, and wellness tracking, among other uses. Each data point collected can add complexity to your broader data set. Because of the amount and complexity of data, turning to machine learning (ML) can help organizations leverage their data to identify patterns and make data-driven decisions. By applying machine learning techniques to wearable device data, we can now surface patterns in big data and make predictions about behavior. Machine learning enables healthcare-related industries to leverage wearable device data and identify trends, improve recommendations, and define research outcomes.

Popularity of wearable devices

Wearables are popular, and their adoption continues to grow. Globally, the wearable technology market is expected to grow from $69 billion in 2020 to $81.5 billion in 2021, an 18.1% increase, according to the latest forecast from Gartner. What’s fueling the growth? Demand for smart devices in the healthcare sector is rising, as is demand for Internet of Things (IoT) devices. Many devices are not fitness-specific, featuring notification of text messages, push notifications for mobile apps, and the ability to pay for items by scanning a QR code with Google Pay or Apple Pay. As such, they have broad appeal. “As a result of the pandemic, we have seen wearable devices become much more than just activity trackers for sports enthusiasts.
These devices are now capable of providing accurate measurements of your health vitals in real time. Improved measurement accuracy coupled with the latest advancements in ML make it possible to detect abnormalities before they lead to a major health event.” – Alex Matsukevich, Fusion Alliance Director of Mobile Solutions

Types of data collected by wearable devices

There are a variety of brands and categories of wearable devices, from mass-market consumer versions to highly specialized types created for niche uses. Apple, Fitbit, Google, Samsung, Garmin, LG, Sony, and Microsoft dominate the market. Though the concept of “wearables” tends to center on wristwatches, exercise equipment, glasses, and textile sensors are also becoming more common. Wearable devices can measure:

- Sleeping patterns
- Heart rate
- Irregular heart rhythms
- Location/route during exercise
- Pace, stride, and distance while moving
- Blood oxygen levels
- Falls

Limitations of wearable device accuracy

Wearables do have limitations, and accuracy is a concern. Healthcare decisions made using erroneous data could have outcomes detrimental to a patient’s overall health. A study from the University of Michigan reviewed 158 publications examining nine different commercial device brands. In laboratory-based settings, Fitbit, Apple Watch, and Samsung appeared to measure steps accurately. Heart rate measurement was more variable, with Apple Watch and Garmin being the most accurate and Fitbit tending toward underestimation. But for energy expenditure (calories burned), no brand was deemed accurate. This does not mean that the results are invalid, but that there is a significant difference between results from wearables and clinical results in a lab setting. Wearable devices are constantly upgraded and redesigned as technology improves. And data collected by wearables does not provide a clinical diagnosis.
As such, this data is just part of the larger picture of health and can be used only in conjunction with other factors to evaluate your overall wellbeing.

Overcoming the biggest challenge of wearable device data analysis

Healthcare professionals are already using ML to analyze data for patients. Research published in the International Journal of Research and Analytical Reviews confirms that ML techniques are successful in predicting health conditions such as heart disease, diabetes, breast cancer, and thyroid cancer. The biggest hurdle to incorporating device data into broader data sets is the addition of new inputs, such as hours of sleep or total steps walked per day. Traditional data points such as total cholesterol or blood pressure readings are less frequent, so there is a smaller amount of data overall. The challenge is finding how to best incorporate it into other data sets to create a more comprehensive picture of health.

The future of wearable device data and machine learning

We can glimpse into the future of wearable device data and machine learning with Microsoft’s recent patent filing. Their potential product aims to provide wellness recommendations based on biometric data, such as blood pressure and heart rate, pertaining to work events. To do this, Microsoft requests access to applications used by employees. Microsoft then tracks data points such as:

- Duration of time spent writing emails
- Number of times a user refreshes their inbox
- Time spent reading emails
- Number of corrections made when writing emails
- Recipient list for emails
- Number of meetings in a day
- Tone of language in emails

By combining this information with biometric data (from a secondary device such as a Fitbit or Apple Watch) and machine learning, Microsoft could begin to understand what work events trigger a response. For example, suppose an employee received an email from their manager.
Microsoft might observe that the employee spent a higher-than-average amount of time reading the email and that the employee’s heart rate was also elevated during this time. Based on these insights, Microsoft could propose recommendations for helping employees manage stress levels, highlighting events that trigger anxiety.

[Patent filing sample image from Microsoft outlining tips and recommendations to improve employee wellness]

With a broad user base already using both Office and Teams, Microsoft has a deep understanding of work-related events. As Facebook built their business making sense of our social lives, Microsoft has the potential to optimize our work lives. “Wearables combined with machine learning will become the new standard in personalized consumer electronics, rapidly increasing in popularity and scale every year. An integrated device of the future will be able to get a baseline of your health and will alert you to any abnormalities present. We already see this happening with the new Apple Watch, and it will be very soon that this technology becomes commonplace.” – Michael Vieck, Fusion Alliance Software Developer

Wearable devices will transform healthcare experiences

Data is the key to predicting, understanding, and improving health outcomes. IBM Research anticipates that the average person will generate more than 1 million gigabytes of health-related data in their lifetime, equivalent to 300 million books. The sheer volume of data means that machine learning will be vital in making sense of it. Paired together, wearable devices and machine learning have the potential to transform healthcare experiences. Today’s applications and uses are only the beginning.

Read more:
- Top 3 reasons to invest in machine learning for mobile
- Machine learning and wearable devices of the future
- Wearing Your Intelligence: How to Apply Artificial Intelligence in Wearables and IoT
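The baseline-and-abnormality idea described above can be illustrated with a simple check: flag heart-rate samples that sit far from the wearer's own average. The sample data and z-score threshold are invented for the example; real monitoring would rely on trained ML models rather than a single statistic.

```python
from statistics import mean, stdev

def flag_anomalies(samples, z_limit=2.0):
    """Flag readings far from this wearer's own baseline.

    A z-score (distance from the mean in standard deviations) is a crude
    stand-in for the learned models discussed above, but shows the principle:
    the baseline is personal, derived from the user's own history.
    """
    mu, sigma = mean(samples), stdev(samples)
    return [s for s in samples if abs(s - mu) / sigma > z_limit]

# Hypothetical resting heart-rate samples (beats per minute).
resting_hr = [62, 61, 63, 60, 62, 64, 61, 63, 62, 110]
spikes = flag_anomalies(resting_hr)
```

Here only the 110 bpm reading is flagged; the everyday variation between 60 and 64 bpm stays within the wearer's normal band.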
While every organization’s journey to digital transformation looks different, one thing remains the same — the importance of data. Tackling your data systems and processes is vital to fully transform. However, the reality is that most organizations are overwhelmed with data about their customers. But these troves of information are completely useless unless companies know that the data they have is accurate and how to analyze it to make the right business decisions. In today’s world, organizations have been forced to pivot and have realized the value data can bring to drive insight and empower their decision-making. However, many organizations have also recognized their data immaturity. So how do you move forward?

The role of data in digital transformation

Data can be your organization’s biggest asset, but only if it is used correctly. And things have changed. A lot of organizations have completed the first steps in their digital transformation, but now they are stuck — they aren’t getting the results they expected. Why? They haven’t truly leveraged their data. According to Forrester, “Firms make fewer than 50% of their decisions on quantitative information as opposed to gut feelings, experiences, or opinions.” The same survey also showed that while 85% of those respondents wanted to improve their use of data insights, 91% found it challenging to do so. So, now that you’ve got the data, how can you make it more valuable?

Data strategy is key to your digital transformation

With so many systems and devices connected, the right information and reporting is critical. But first, you have to make sure you have the right technology in place.

Utilizing big data

Although you might feel inundated with the amount of data you have coming in, using big data analytics can bring significant value to your digital transformation. Through big data analytics, you can get to a granular level and create an unprecedented customer experience.
With information about what customers buy, when they buy it, how often they buy it, etc., you can meet their future needs. Big data also enables both digitization and automation to improve efficiency and business processes.

Optimizing your legacy systems
Legacy systems are critical to your everyday business, but they can be slow to change. Why fix what’s not necessarily broken? But just because systems are functioning correctly doesn’t mean they’re functioning at the level you need them to — a level that is conducive to achieving your data and digital transformation goals. This doesn’t have to mean an entire overhaul. You’ve likely invested a lot into your legacy systems. One key to a good data strategy is understanding how to leverage your legacy systems to make them a part of (instead of a roadblock to) your digital transformation. With the enormous scale of data so closely tied to applications, coding and deployment can often make this stage of your digital transformation feel overwhelming. Sometimes DevOps tooling and processes are incompatible with these systems, so they are unable to benefit from Agile techniques, continuous integration, and delivery tooling. But it doesn’t have to feel impossible — you just need the right plan and the right technology.

Focusing on your data quality
Even with the right plan and technology, you have to have the right data. Bad data can have huge consequences for an organization and can lead to business decisions made on inaccurate analytics. Ultimately, good data needs to meet five criteria: accuracy, relevancy, completeness, timeliness, and consistency. With these criteria in place, you will be in the right position to use your data to achieve your digital transformation goals.

Implementing a data strategy with digital transformation in mind
So how do you implement your data strategy? You should start by tackling your data engineering and data analytics. The more you can trust your data, the more possibilities you have.
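One way to make the five quality criteria actionable is to automate checks over incoming records. The minimal sketch below scores a batch of customer records for completeness, timeliness, and one simple consistency proxy; the field names, the 30-day freshness window, and the country-code rule are illustrative assumptions, not a standard.

```python
# Minimal data-quality scorecard for the criteria named above.
# Field names, the 30-day freshness window, and the country-code
# consistency rule are illustrative assumptions.
from datetime import datetime, timedelta

def quality_report(records, required_fields, max_age_days=30):
    """Score a batch of customer records against simple quality rules."""
    total = len(records)
    # Completeness: every required field is present and non-empty.
    complete = sum(all(r.get(f) not in (None, "") for f in required_fields)
                   for r in records)
    # Timeliness: the record was updated within the freshness window.
    cutoff = datetime.now() - timedelta(days=max_age_days)
    timely = sum(r["updated_at"] >= cutoff for r in records)
    # Consistency (one simple proxy): country codes share one format.
    formats = {len(r.get("country", "")) for r in records}
    return {
        "completeness": complete / total,
        "timeliness": timely / total,
        "consistent_country_codes": len(formats) == 1,
    }

records = [
    {"name": "Ada", "country": "US", "updated_at": datetime.now()},
    {"name": "", "country": "USA",
     "updated_at": datetime.now() - timedelta(days=90)},
]
print(quality_report(records, ["name", "country"]))
```

In practice, checks like these run inside the data pipeline so that low-scoring batches are quarantined before they feed analytics.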
By solving your data quality problem, you can achieve trust in your data analytics. And then, the more data you have on your customers, the more effective you can make your customer experience. But this all requires a comprehensive data strategy that allows your quality data to be compiled and analyzed so you can use it to create actionable insights. The biggest tools to help here are AI and machine learning.

The benefits of a data-driven digital transformation
The benefits of investing in your data are clear, including increased speed to market, faster incremental returns, extended capabilities, and easier access and integration of data. Discover more about the different ways you can invest in your data and improve and accelerate ROI for your organization. Ultimately, your goal is to elevate how you deliver value to your customers. Digital transformation is the key to understanding your customers better and providing a personalized customer experience. Leveraging your data can make all the difference between you and your competitors. And we’re here to help. Learn more about how some of our clients have benefited from investing in their data and digital transformation.
The pandemic made it clear that traditional banking is a thing of the past. Online banking had already been on the rise, but a 200% jump in new mobile banking registrations in April 2020 established that customers are able and willing to change. As more Americans bank virtually, banks are fighting to meet customer demand. And beyond the challenges set forth by the pandemic, “digital natives” like Rocket Mortgage, Venmo, Stripe, and Robinhood are all vying for business. These technology-forward organizations position themselves differently from traditional financial institutions and are attracting a younger user base for their services. But a traditional bank has advantages over these challengers:

Familiarity and history: Your personal relationships and history with customers mean that your bank often knows them better. And customer questions can be answered in person instead of being routed through a call center.

Deep and rich data: Historical data can prove invaluable for ML efforts. Customer deposit amounts, payments, and balance information can be used to predict future behavior.

Preference for personal banking: Customers, especially those with a high net worth, may be uncomfortable depending on digital channels for wealth management. A new brand might be a risk, and they could feel uncomfortable not having a specific person to call if something goes awry.

As we get back to our “new normal,” traditional banks can use the rich data and relationships they have with customers to their advantage. Forward-thinking leaders are reimagining what it looks like to do business, and they’re using machine learning to elevate the customer experience. Discover how you can use machine learning to create engaging and profitable relationships with your customers.
Every bank can find value from machine learning
Machine learning might sound like a type of data analysis useful only to the largest organizations, but its concepts can scale to meet the needs of small and mid-sized banks too. When we use the term machine learning (ML), we are referring to machines and systems that can learn from “experience” supplied by data and algorithms. In banking specifically, ML algorithms can identify patterns in data beyond what humans are capable of observing, and these learnings can be applied to new data sets. It is now possible to improve the customer experience using ML. By parsing customer transaction data, ML can identify clues and patterns ahead of time, even before the customer considers taking action. For example, the process of buying a home and obtaining a mortgage might begin with small savings accumulation or an increase in deposit amounts from wages. ML models can assess banking-specific data like credit patterns, risk tolerance, and price sensitivity, and can be coupled with demographic data like age, median income, and distance to branch. The goal of using ML in this use case is to target prospective customers with offers most relevant to their situation and stay ahead of customer demands.

Knowing where to begin — and where to focus efforts
Machine learning has such a wide variety of applications that it can be difficult to know where to start. Identifying a use case for customer-focused ML expenditures is a good first step. In general, we have found that you can benefit from starting with a use case of low or medium relative complexity. Examples focused on improving the customer experience include:

Predicting service line interest (HELOC, mortgage refinance, etc.)
Streamlining loan approval processes
Increasing lines of credit
Improving fraud alert notifications

With so many use cases to choose from, it can be easy to get lost in the planning for each example. Instead, try focusing on one area at a time.
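To make the propensity idea concrete, here is a toy sketch: a hand-rolled logistic model learns, from a few invented signals (savings growth, deposit increase, a mortgage-page visit), which customers resemble past mortgage borrowers. All feature names, data, and hyperparameters are hypothetical; a production model would be trained on real historical data.

```python
# Toy propensity model: a hand-rolled logistic regression learns which
# customers resemble past mortgage borrowers. Features, data, and
# hyperparameters are invented for illustration only.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(rows, labels, lr=0.1, epochs=2000):
    """Fit feature weights and a bias term by stochastic gradient descent."""
    w = [0.0] * (len(rows[0]) + 1)               # feature weights + bias
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + w[-1])
            err = p - y                          # gradient of log loss
            for i, xi in enumerate(x):
                w[i] -= lr * err * xi
            w[-1] -= lr * err
    return w

def score(w, x):
    """Probability-like propensity score for one customer."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + w[-1])

# Features: [savings growth, deposit increase, visited mortgage page]
rows = [[0.9, 0.8, 1], [0.7, 0.9, 1], [0.1, 0.2, 0], [0.2, 0.1, 0]]
labels = [1, 1, 0, 0]                            # took out a mortgage?
w = train(rows, labels)
print(score(w, [0.8, 0.85, 1]))                  # resembles past borrowers
print(score(w, [0.1, 0.1, 0]))                   # does not
```

In a real project, a library such as scikit-learn would replace the hand-rolled training loop, but the shape of the problem, historical behavior in, propensity score out, stays the same.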
Using your strengths, combined with ML concepts, you can deliver an optimal customer experience that digital challengers just can’t match. Need help getting started? Check out our Machine Learning Jumpstart program.

Cross-selling across the relationship with machine learning
You can leverage machine learning to determine not only which customers would be a good fit for a mortgage loan, but also the other products those customers might need. Continuing with the mortgage example, a Home Equity Line of Credit (HELOC) might be a good match for a new homeowner. In any case, the message and product can be tailored to meet the customer’s specific needs. Another part of cross-selling is personalizing the offer based on the customer’s history and propensity to buy. Perhaps a discounted interest rate would appeal to one type of customer, while a waived application processing fee would entice another. For individuals identified as interested in high-revenue products, the marketing effort can be even more personalized, such as a phone call or an in-person event invitation.

Applying machine learning in real life
The following illustration is an example of how an internal dashboard might appear to a banker or service representative. For any specific product, each person has a percentage likelihood that they will take action. Individual model scores are shown, along with next steps, such as outreach about an investment account or mailing a promotion about mortgage rate refinancing. In this example, marketing inputs, like website data, are combined with transaction and deposit information. When a banker or service representative encounters a customer, either in person or on the phone, they can suggest specific next steps or ask if the customer has questions. A dashboard with this information empowers banking employees to guide the conversation with data in real time.
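The “next step” logic such a dashboard relies on can be as simple as picking the product with the highest model score, surfaced only above a confidence threshold. A minimal sketch, with hypothetical products, scores, and threshold:

```python
# Sketch of a dashboard's next-best-action rule: pick the product with
# the highest model score, but only surface it above a confidence
# threshold. Products, scores, and the 0.6 threshold are hypothetical.
def next_best_action(scores, threshold=0.6):
    """Return (product, score) for the top product, or None if all are weak."""
    product, score = max(scores.items(), key=lambda kv: kv[1])
    return (product, score) if score >= threshold else None

customer_scores = {
    "mortgage_refinance": 0.72,   # e.g. from website + deposit signals
    "investment_account": 0.41,
    "heloc": 0.65,
}
print(next_best_action(customer_scores))
```

The threshold keeps weak suggestions off the screen, so a representative only sees prompts the models are reasonably confident about.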
Related Case Study: Machine learning predicts outcomes at Primary Financial How does a financial services firm improve sales targeting to predict its clients’ desires to invest? Machine learning was the answer for PFC. Learn more.

FAQs about machine learning and banking

Does the machine learning process work fast enough to enable real-time benefits? For all but the most complex scenarios, yes! Normally, ML is fast enough to be integrated into real-time transactions.

Does machine learning get in the way of compliance requirements? In general, no. By using existing data that you obtained, or using your data in coordination with third-party data, you are not running afoul of privacy and compliance concerns.

How do we ensure the machine learning use case we pick is the right one, given that there are so many to choose from? We recommend focusing initially on low-cost, high-ROI use cases with low-to-medium relative complexity. With additional experience, context, pipelines, and an understanding of how advanced analytics programs operate, more complex initiatives can then be undertaken.

What if data reliability is a concern? Using low-quality data is not advised, but it is possible to start projects with small data sets. Engaging a third party to evaluate your situation can help here.

Reimagining customer insights & relationships
Banks that employ machine learning will have a portfolio of more customers than ever who are positioned for a variety of banking products delivered in a digital, personalized, and meaningful way. Now is the time to act and implement machine learning to meet customers where they are, using the contact methods they desire and delivering the products and services that best meet their needs. Need help building your use case and plan? Access our Machine Learning Use Case Guide for Banks. Want to dig deeper? Check out our webinar on this topic.
The financial industry has faced waves of changes over the last two centuries. Emerging nations, the American gold rush, the power of the stock market, and even the Great Depression have all shaped how banking works and what consumers expect from their banks. Notably, from 2015 onward, bankers began to list technology risk among their top five concerns [1]. While these changes have increased banking access and options for the average consumer, they have also brought in more tech-savvy competition and greater regulatory scrutiny as heaps of data have become digitally accessible. Ironically, the very technological disruption that has so upended the financial industry will also be what brings new opportunities for growth and increased wallet share. This is especially true with advanced data tools such as artificial intelligence (AI) and machine learning (ML); according to one source [2], 83% of early AI adopters have already achieved substantial (30%) or moderate (53%) economic benefits. In light of the benefits machine learning can bring, we’ve compiled four major areas where we’ve seen ML used to reduce costs, increase revenue, and mitigate risk for banks.

1. Acquire new customers
Gone are the days when marketing was limited to just a few channels; now banks must maintain an omnichannel presence in order to reach younger consumers who may not listen to the radio or watch TV. Acquiring new customers means reaching them where they are with messaging that’s highly targeted and relevant. Yet as margins get slimmer and budgets are squeezed, reaching these consumers with targeted messaging without breaking your budget can be a challenge.

How machine learning can help
Making the most of your marketing involves making the most of your data. Machine learning can help you identify trends in consumer behavior and interests, which can help you deliver the right marketing messages in the right channels at the right time.
ML opportunities
Identify which existing bank customers will buy another bank product
Score your commercial leads based on risk, profitability, and probability to close

ON-DEMAND WEBINAR: Learn how to turn data into insights that drive cross-sell revenue

2. Deepen relationships with customers
Digital transformation has affected every business in profound ways, especially in the area of reaching customers and managing the customer relationship. Today’s users want a more seamless experience, more targeted messaging, and on-demand access to information, and they’ll move to the bank that can meet their digital demands.

How machine learning can help
AI and ML allow you to combine your leadership’s decades of experience with customer engagement data. So not only will you have a gut check of what customers want, you’ll have quantifiable data to back it up. That means your sales and customer relationship initiatives will ultimately be more effective at targeting customers ready to upsell and at cross-selling more of your products to hungry buyers.

ML opportunities
Identify high-value customers early and engage with them differently
Predict the likelihood of a customer taking their deposits elsewhere
Identify which disputed purchases are legitimate
Project a customer’s lifetime value for those with a limited history with the bank

3. Reduce financial risk
Consumers are becoming both more credit averse and less creditworthy, which puts extra pressure on banks and credit unions of all sizes. On top of that, banks face increased risk caused by data breaches, fraudulent activity, and increased costs brought on by regulatory compliance [3]. These challenges make maintaining adequate cash reserves more difficult than ever before at a time of increasing market volatility.
How machine learning can help
Machine learning can give you the insights needed to reduce your overall financial risk by helping you identify fraud and financial liabilities early – so you make and keep more of your profits.

ML opportunities
Clarify the liabilities on your balance sheet and determine which are the greatest risks
Detect fraud and misuse of the company’s finances
Project cash reserves to reduce excess bank cash

GET THE USE-CASE WORKBOOK: The ultimate guide to machine learning use cases for banks

4. Optimize investment offerings
The investment management arm of today’s banks continues to change rapidly as industry challenges increase. Today’s investment managers deal with increased market volatility, capped organic growth, and increasing fees. Because of these challenges, they struggle to keep up with the shifting expectations of clients who demand better investment turnover.

How machine learning can help
Machine learning can be used to detect patterns hidden in a bank’s historical investment data combined with external financial data. These patterns produce actionable insights that can increase the accuracy of key investment decisions.

ML opportunities
Match securities to investors based on trade history and market conditions
Dynamically price securities based on competitive offerings, market saturation, and risk profile

See how one institution used ML to predict its deposit customers' likely deposits on a daily basis, freeing $40,000,000 in excess cash reserves.

The up-and-coming (and existing) opportunities for financial institutions to win with ML are staggering. One source estimates that advanced data initiatives like AI and ML will boost overall business profitability by 38% and generate $14 trillion of additional revenue by 2035 [4].
And while it’s true that digitally savvy industry newcomers may take advantage of these trends faster than their legacy peers, legacy banks and credit unions hold something the younger competition doesn’t: mountains of historical data that, when mined for insights using AI and ML, can give them a leg up in retaining customers, increasing their wallet share, and reducing their overall financial risk. To adequately leverage these four opportunities, banks will need to embrace the very technology disrupting the industry. Banks that view their data as one of their most important assets and embrace AI and ML to create new insights will likely see growth, whereas those that don’t will struggle to keep up.

[1] 2015 Banking Banana Skins Report
[2] Deloitte, 2017
[3] Financial News, 2017
[4] Accenture, n.d.
Executive summary
The credit card industry is becoming more complex. Advanced loyalty programs, targeted offerings, unclear rate conditions, and many other factors can often make it difficult for banks to identify the right customer. Ultimately, the financial services firms that will succeed in this environment will engage the right customers with the right message at the right time. Market leaders will be those who can accurately forecast the revenue and risk for each prospective and existing customer. While the credit card environment has changed, the analytics and modeling techniques have largely remained the same. These models are highly valuable, but they do not offer the flexibility to evaluate the granular and complex customer behaviors embedded in a financial services firm’s data and other public and private data sets. Machine learning and deep learning (collectively, machine learning) change the paradigm for predictive analytics. In lieu of complex, expensive, and difficult-to-maintain traditional models, machine learning relies on statistical and artificial intelligence approaches to infer patterns in data, spanning potentially billions of available patterns. These insights, not discoverable with traditional analytics, may empower the financial industry to make higher-value, lower-risk decisions. In this brief article, we discuss three potential opportunities that Fusion expects should add high value to the financial services industry.

Advanced analytics for banking
Machine learning uncovers patterns in complex data to drive a predictive outcome. This is a natural fit for the banking industry, as firms are often working with imperfect information to determine the value of incoming customers.

How it works: Traditional models vs. machine learning
Credit scorecards represent the basis of most credit card issuance decision making.
Whether a firm leverages off-the-shelf models or applies bespoke modeling, the underlying scorecard structure is broadly similar. In the aggregate, these models are highly valuable. But on a per-applicant basis, patterns and details are lost. With machine learning, we can explore detailed and expansive public and private data about segmented applicants for marketing purposes in real time. For example, we can supplement existing models with data that can be used to segment potential customers, such as:

Regional FICO trends
Educational attainment
Social media sentiment analysis
Mortgage and equity analysis
Much, much more

Machine learning can apply artificial neural networks to uncover patterns in your applicants’ history across millions of data points and hundreds of training generations. When detecting these patterns, machine learning models can uncover risk in approved applicants and value in sub-prime applications. For example, by exploring existing customers, machine learning could potentially reveal that applicants with low FICO scores but high educational attainment in a specific city suburb have historically resulted in minimal write-offs. Conversely, an applicant with a high FICO score may have recently moved into a higher-net-worth neighborhood, driving high spending on a financial institution’s credit lines and creating repayment risk. Ultimately, your customer data can tell a far richer story about your customers’ behavior than simple payment history.

Machine learning opportunities
Financial services firms can gain more insight and capitalize on the benefits of machine learning by applying their marketing dollars toward customers who are more likely to fit within their desired financial portfolio.

Lifetime customer value for customers with limited credit data
Currently, credit scores are determined using traditional data methods.
Traditional data typically means data from a credit bureau, a credit application, or a lender’s own files on an existing customer. One in 10 American consumers has no credit history, according to a 2015 study by the Consumer Financial Protection Bureau (Data Point: Credit Invisibles). The research found that about 26 million American adults have no history with national credit reporting agencies, such as Equifax, Experian, and TransUnion. In addition to those so-called credit invisibles, another 19 million have credit reports so limited or out-of-date that they are unscorable. In other words, 45 million American consumers do not have credit scores. Through machine learning models and alternative data (any data that is not directly related to the consumer’s credit behavior), lenders can now directly implement algorithms that assess whether a banking firm should market to a customer segment, assigning customer risk and scores even to credit invisibles (thin-file or no-file customers). Let’s look at a few sources of alternative data and how useful they are for credit decisions.

Telecom/utility/rental data
Survey/questionnaire data
School transcript data
Transaction data – This is typically data on how customers use their credit or debit cards. It can be used to generate a wide range of predictive characteristics.
Clickstream data – How a customer moves through your website, where they click, and how long they spend on a page.
Social network analysis – New technology enables us to map a consumer’s network in two important ways. First, this technology can be used to identify all the files and accounts for a single customer, even if the files have slightly different names or different addresses. This gives you a better understanding of the consumer and their risk. Second, we can identify the individual’s connections with others, such as people in their household.
When evaluating a new credit applicant with no or little credit history, the credit ratings of the applicant’s network provide useful information. Whether a bank wants to more efficiently manage current credit customers or take a closer look at the millions of consumers considered unscorable, alternative data sources can provide a 360° view with far greater value than traditional credit scoring. Alternative data sets can reveal consumer information that increases the predictive accuracy of credit scores for millions of credit prospects. This allows companies to target consumers who may not appear desirable because they have been invisible to lenders before, which can lead to a commanding competitive advantage.

ON-DEMAND WEBINAR: Learn how to turn data into insights that drive cross-sell revenue

Optimizing marketing dollars to target customers
Traditional marketing plans for credit card issuers call for onboarding as many prime customers as possible who meet the bank’s risk profile. However, new customer acquisition is only one piece of the puzzle. To drive maximum possible profitability, banks can consider not only the volume of customers but also the overall profitability of a customer segment. Once these high-value customer segments are identified, credit card marketers can tailor specific products to these segments to deliver high value. Machine learning can assist both in the prediction of total customer value and in the clustering of customers based on patterns and behaviors.

Identifying high-risk credit card transactions in real time
Payments are the most digitalized part of the financial industry, which makes them particularly vulnerable to digital fraud. The rise of mobile payments and the competition for the best customer experience push banks to reduce the number of verification stages. This leads to lower efficiency of rule-based approaches.
The machine learning approach to fraud detection has received a lot of publicity in recent years and has shifted industry interest from rule-based fraud detection systems to machine-learning-based solutions. However, there are also understated and hidden events in user behavior that may not be evident but still signal possible fraud. Machine learning allows for creating algorithms that process large datasets with many variables and helps find these hidden correlations between user behavior and the likelihood of fraudulent actions. Another strength of machine learning systems compared to rule-based ones is faster data processing and less manual work. Machine learning can be used in a few different areas:

Data credibility assessment – Gap analytics help identify missing values in sequences of transactions. Machine learning algorithms can reconcile paper documents and system data, eliminating the human factor. This ensures data credibility by finding gaps in it and verifying personal details via public sources and transaction history.

Duplicate transaction identification – The rule-based systems in use today routinely fail to distinguish errors or unusual transactions from real fraud. For example, a customer can accidentally push a submission button twice or simply decide to buy twice as many goods. The system should differentiate suspicious duplicates from human error. While duplicate testing can be implemented by conventional methods, machine learning approaches will increase accuracy in distinguishing erroneous duplicates from fraud attempts.

Identification of account theft and unusual transactions – As the pace of commerce grows, it’s very important to have a lightning-fast solution to identify fraud. Merchants want results immediately, in microseconds. We can leverage machine learning techniques to achieve that goal with the sort of confidence level needed to approve or decline a transaction.
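The duplicate-identification idea above can be sketched as a simple screening rule that surfaces candidate pairs for a downstream model or reviewer to classify as double-click, legitimate repeat purchase, or fraud. Field names and the 60-second window below are illustrative assumptions.

```python
# Sketch of duplicate screening: flag transaction pairs that share a
# card and amount within a short window, so a reviewer or downstream
# model can separate double-clicks from fraud. Field names and the
# 60-second window are illustrative assumptions.
def flag_duplicates(txns, window_seconds=60):
    """Return index pairs of suspiciously similar adjacent transactions."""
    ordered = sorted(enumerate(txns), key=lambda item: item[1]["ts"])
    flagged = []
    for (i, a), (j, b) in zip(ordered, ordered[1:]):
        same = a["card"] == b["card"] and a["amount"] == b["amount"]
        if same and b["ts"] - a["ts"] <= window_seconds:
            flagged.append((i, j))
    return flagged

txns = [
    {"card": "4111", "amount": 59.99, "ts": 0},
    {"card": "4111", "amount": 59.99, "ts": 12},    # likely a double-click
    {"card": "4111", "amount": 59.99, "ts": 4000},  # a later, separate buy
]
print(flag_duplicates(txns))
```

A rule like this only generates candidates; the accuracy gain the article describes comes from the ML model that scores each flagged pair using richer behavioral features.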
Machine learning can evaluate vast numbers of transactions in real time. It continuously analyzes and processes new data. Moreover, advanced machine learning models, such as neural networks, autonomously update their models to reflect the latest trends, which is much more effective in detecting fraudulent transactions.

Summary
Bottom line: machine learning can leverage your data to develop patterns and predictions about your customers and applicants. These machine learning models are typically simpler to develop and deploy and may be more efficacious than traditional financial services modeling. These models also enable a more detailed forecast about your customers, allowing you to reduce risk while targeting more profitable customers through their lifetime with your credit card services.

Related resources
Case study: Machine learning predicts outcomes in financial services
Case study: How Donatos uses machine learning to retain customers
5 tips to keep the wealth in your company

Fusion Alliance has extensive experience in the financial services industry and serves as a preferred solutions provider for many prominent financial services institutions, including Fortune 500 firms. If you’d like to discuss your organization, let us know.
This article was originally published at the Forbes Communication Council. Amazon has generally been considered the standard-bearer for product recommendations, and for good reason. The retail giant utilizes user data on past purchases, browsed-for items, and even what users have recommended to others to generate recommendations. Just think: recommendations likely popped up in the sidebar during your most recent Amazon binge. “People who viewed this product also viewed...” often appears as you scroll. A chatbot may even appear with ideas related to your shopping history. This is conversational marketing at work. Still, these advancements fall short of creating a truly personal experience that can predict and assist buying behavior by having a full view of who the person is — not just their recent search and purchase history. Even common segmentation methods fall short by making assumptions based on age and gender that fail to account for many outlying factors that could easily be discovered. The future of retail will be defined by immersive, conversational experiences that lead to better customer interactions and increased buyer loyalty for brands that stay ahead of the curve. While conversational marketing has taken many companies this far, using conventional conversational marketing techniques in conjunction with machine learning can be the answer retailers are looking for to create the experience of the future.

Online retail: Blending conversational marketing with AI technologies
While conversational marketing has become the trend in business-to-business (B2B) demand generation strategy, there exists a huge business-to-consumer (B2C) opportunity as well. Conversational marketing practices utilize website chat features and chatbots to initiate in-the-moment interactions with customers and build context to quickly qualify them for the appropriate next step.
According to David Cancel’s aptly titled book, Conversational Marketing, both baby boomers and millennials are likely to adopt the use of chatbots, with a majority in both groups citing instantaneous responses and quick answers to simple questions as potential benefits. Aside from the obvious advantage of getting answers to product questions, automated chatbots offer a number of opportunities to enhance shopping experiences when coupled with data.

Machine learning and chatbots
While many B2C companies are already leveraging chatbots to streamline the customer experience, there lies even greater opportunity in using machine learning to truly learn from and predict consumer behavior. Today’s practical machine learning models enable rapid iteration and deliver quick, reliable data sets. Data collected from customer conversations about the products they research, buy, and use can tell a deeper story about the customers themselves over time. Instead of a static list of recommended products based on their last purchase, machine learning can help us understand the customer’s lifestyle and habits in such a way as to help them make the best purchase in the moment. As an example, imagine an on-the-go, seasoned business professional with a love of podcasts and streaming music. Our traveling audiophile is a regular adopter of new headphone technology and is on the hunt for a new pair. While segmented data and previous purchase history might get us in the ballpark for their next tech purchase, it doesn’t tell the whole story. In fact, the reason for this purchase has nothing to do with a search for the latest technology, but rather that past purchases have missed the mark for this customer’s need for multitasking and call connectivity. In this case, relying on past purchase history or even peer purchasing information won’t help.
However, their experience with a chatbot powered by machine learning can give us helpful predictive data that informs the retailer of their need for a balance between audio quality and the ability to quickly and clearly connect to meetings during travel. A few quick questions allow the chatbot to suggest a new pair of headphones to fit their lifestyle, along with helpful content and reviews that match our customer’s pre-purchase research habits. Marrying predictive data to emerging technologies As advances in artificial intelligence (AI) continue to blur the line between human and bot, and retail brands continue to experiment with augmented reality (AR) to replicate brick-and-mortar shopping experiences, it’s vital that data plays a role in the next phase of online shopping. Not only should brands be placing an emphasis on the aesthetic experience that can be delivered through apps infused with AR, but they should also make room for predictive machine learning data to make the buying process even easier for the consumer, making them more likely to return in the future. In fact, for any brand wishing to be at the forefront of the next wave of retail evolution, I believe it’s vital that a data governance framework be in place and actively funnel information to teams developing emerging technology. The days of keeping customer data siloed away from our product teams need to come to an end in order to fully realize the marketplace potential. The future of retail is filled with possibilities that can completely reshape the way we understand consumer behavior and connect with the consumer to meet their needs in real time. Taking tangible steps to listen to our customers, learn from them, and act to predict their needs, while delivering a stellar shopping experience along the way, is more than a possibility — it’s a reality. 
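The chatbot interaction described above can feed a recommendation step directly. In this toy sketch, a couple of quick answers re-weight product attributes so the suggestion reflects the customer’s stated need for call clarity rather than their purchase history; the products, attributes, and weights are all invented for illustration.

```python
# Toy sketch of chatbot-informed recommendation: answers from a short
# conversation re-weight product attributes so the suggestion matches
# the customer's stated needs. Products, attributes, and weights are
# invented for illustration.
HEADPHONES = {
    "studio_reference": {"audio_quality": 0.9, "call_clarity": 0.3},
    "travel_headset":   {"audio_quality": 0.7, "call_clarity": 0.9},
}

def recommend(answers, catalog=HEADPHONES):
    """Return the product whose attributes best match the answers."""
    def fit(attrs):
        return sum(answers.get(k, 0) * v for k, v in attrs.items())
    return max(catalog, key=lambda name: fit(catalog[name]))

# The chatbot learned that calls matter more than pure audio this time.
answers = {"audio_quality": 0.4, "call_clarity": 1.0}
print(recommend(answers))
```

A production system would learn the attribute weights from conversation and behavior data rather than hard-coding them, but the matching step looks much the same.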
At Fusion Alliance, we find our place at the intersection of advanced analytics, experience design, and technology, leveraging machine learning to gain customer insights that inform our strategies.
Amazon, Netflix, Airbnb, Uber, and other disruptors have raised the bar on what customers expect from a business. These online giants have figured out how to use their customer data to make personalized recommendations and predict when customers are going to buy — and present offers at just the right time. Brands that use personalization report an average growth of 20% in sales (Monetate research), and customers feel less spammed and more like they’re in control of the experience. It’s no surprise that consumers are looking for that same personalized, frictionless experience when interacting with their financial institutions, whether through mobile banking, your website, a brick-and-mortar branch, or one of your ATM locations. And it pays off for banks that can engage their customers. According to a 2013 Gallup study, fully engaged customers bring in an additional $402 in revenue per year to their primary bank, compared with those who are actively disengaged. Even better, the research found that 71% of fully engaged customers believe they will be customers of their primary bank for the rest of their lives. That could be your bank, but only if you can reach your customers in ways that feel natural and valuable to them.

Customers want to be engaged with the right messages at the right time

Imagine if you could understand your customers so deeply and predict their buying patterns so clearly that you could deliver targeted marketing only to those ready to invest in more products with your bank. Not only that, what if you knew what to say to them and on which channels to reach them? How would that impact your business? The trend is clear: financial institutions must adopt a customer-centric business model now to ensure success in the future. This puts banks like yours at a crossroads, and the question is where and how to embark on that journey.

Tackle your greatest challenges

The formula seems simple: increase your engagement and you’ll increase your revenue.
But meanwhile, you’re under pressure to acquire new customers, maintain your base, forecast and reduce risk, manage capital, navigate security compliance and financial regulations, and optimize the business. You may also grapple with siloed data, legacy systems, and outdated processes, all seemingly monumental challenges that may adversely affect your customer experience. For example, your customers and employees may not have access to the right data at the right time to provide an optimal experience. Or, from a marketing standpoint, different departments within your company may be targeting the same customers, resulting in too many emails. Or your customers may get untimely messages about promotions that have passed, or receive communications that don’t apply to their current situation. This creates frustration and a poor user experience that may be enough to make your loyal customers turn away. Other banks have been in your shoes, facing the same challenges and fears, but they’ve made major strides in putting the focus on the customer. They’ve found success through the “magic” of machine learning (ML). ML lets you focus your over-capacity bankers’ time and your marketing spend on the opportunities that are real. ML is a modern technique that uses algorithms to analyze enormous amounts of data. Machine learning models learn on their own, identifying insights and patterns that predict future behavior. Machine learning algorithms connect the dots far faster, and at far greater depth, than people can, exposing patterns in your customers’ behavior that empower your team to take actions that will impact your business’s top and bottom lines. Unlike traditional analytics tools, ML can evaluate account holders, securities, and transactions in real time. If you want immediate decisions integrated in the moment, machine learning is the answer.
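To make the idea concrete, here is a minimal sketch of the kind of model behind such predictions: a tiny logistic regression trained from scratch on invented engagement features. A real bank would use far richer data and a mature library, but the mechanics — learn weights from labeled history, then score new customers — are the same:

```python
import math

def train_logistic(rows, labels, lr=0.1, epochs=500):
    """Fit a tiny logistic-regression model with plain gradient descent.
    Each row is a feature vector; labels are 0/1 (e.g. bought more products)."""
    w = [0.0] * len(rows[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            p = 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
            err = p - y                       # prediction error for this customer
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Probability (0..1) that a customer with features x is a buyer."""
    return 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))

# Invented features: [monthly logins (scaled 0-1), products held]
X = [[0.1, 1], [0.2, 1], [0.8, 3], [0.9, 4], [0.7, 2], [0.15, 1]]
y = [0, 0, 1, 1, 1, 0]   # 1 = engaged customer who went on to buy more
w, b = train_logistic(X, y)
print(predict(w, b, [0.85, 3]) > 0.5)  # highly engaged customer scores as a buyer
```

The "real-time decisions" the article describes amount to calling something like `predict` as each transaction or interaction arrives.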
And, good news: even though you may feel you are behind the curve right now, you have something that the younger fintechs you compete against don’t — a wealth of historic data that can be “mined” by ML to answer your specific business questions. Some organizations need help improving the quality of their data for effective use in a machine learning model, and that’s not an uncommon challenge. But good data will be your key to success.

Machine learning applications in finance

Banks have found many successful ways to leverage machine learning. For example, they use it to answer specific business questions across all departments, such as: How do I increase my customer wallet share? What are my best opportunities for cross-selling and remarketing to my existing customers? Can I identify customers that we can convert from other banking institutions? Can I identify loan-default risk early enough to take action? Can I dynamically price securities based on investor demand and market saturation? Can I predict my cash and reserve activity to optimize liquidity levels? Can I identify account holders’ attrition activity before they disengage? What rate and product messaging would make my ideal prospect buy? The first step toward engaging customers with the right messages at the right time is to capture the questions your bank wants to solve. With these questions in hand, you can move to the next step: seeing how much predictive value these machine learning “use cases” will give your financial organization. Case in point: these very questions are how it started for a large institutional bank we worked with, one sitting on decades of financial transaction data. They wanted to more accurately predict member activity and drive better returns on cash reserves, and they leveraged machine learning to do it.
Our machine learning model identified patterns in their transactions, which spanned hundreds of credit unions and billions of dollars in cash, to predict the deposit activity of millions of credit union members on a daily basis. The result? We freed $40 million in excess cash reserves. The insights gleaned also empowered the organization to pass on greater returns to members by selling short- and long-term securities, pursuing arbitrage, and reducing borrowing fees. Another institution, Primary Financial Corporation (PFC), found great success using machine learning to improve its sales targeting. PFC wanted to predict CD issuers’ funding needs and institutions’ desire to invest. They developed machine learning models that synthesized PFC’s financial and competitive data to price securities, identify buyers, and project trade profitability. By the time the first phase of the project was complete, PFC could predict, with over 80% accuracy and 70% precision, the likelihood of a particular investor to buy a given investment. The common thread in these stories is that both organizations had an abundance of historic data at their fingertips, but they hadn’t explored how ML could help them retain more deposits, sell more products, or reduce their financial risks. The rapid predictive insights that machine learning continues to provide to both companies have been game-changing, and both are now exploring other ML applications.

Get started

Machine learning is widening the gap between banks that embrace it and competitors that haven’t. If you don’t improve your banking experience, your customers will turn to another bank or even be serviced by a fintech. As you navigate how to become the customer-centric organization you want to be, explore machine learning as a way to get closer to your customers and see rapid results. Start by coming up with specific questions that your business needs to answer, and take time to learn more about what machine learning can do in your organization.
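For readers wondering what figures like “80% accuracy and 70% precision” mean in practice, here is a small sketch (with invented data) showing how the two metrics are computed from a model’s buy/no-buy predictions:

```python
def confusion_metrics(actual, predicted):
    """Compute accuracy and precision for binary buy (1) / no-buy (0) calls."""
    tp = sum(a == 1 and p == 1 for a, p in zip(actual, predicted))  # true positives
    tn = sum(a == 0 and p == 0 for a, p in zip(actual, predicted))  # true negatives
    fp = sum(a == 0 and p == 1 for a, p in zip(actual, predicted))  # false alarms
    accuracy = (tp + tn) / len(actual)             # share of all calls that were right
    precision = tp / (tp + fp) if tp + fp else 0   # share of flagged buyers who bought
    return accuracy, precision

# Invented example: 10 investors, model flags 5 as likely buyers
actual    = [1, 1, 1, 0, 0, 0, 0, 1, 0, 0]
predicted = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0]
acc, prec = confusion_metrics(actual, predicted)
print(acc, prec)  # 0.7 0.6
```

High precision matters in sales targeting because every flagged investor costs a banker’s time; accuracy alone can look good even when most flags are wrong.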
Contact Fusion Alliance to discuss if ML is right for your project.
Artificial intelligence (AI) and machine learning (ML) have completely transformed mobile development. Mobile app users today are often looking for an easy and relevant user experience, one that has been customized for them. The best way to get there? Machine learning. Machine learning identifies anomalies and patterns that ultimately optimize the user experience. If your technology conversations have stalled at the brainstorming or ideation phase, consider why. If you don’t have a clear answer, you’re not alone. “Strategic decision makers across all industries are now grappling with the question of how to effectively proceed with their AI journey,” says Marianne D’Aquila, research manager, IDC Customer Insights & Analysis. Despite questions about how to proceed, organizations know they need to invest in ML for mobile before current competitors, and those waiting in the wings, figure out how to profit from it first. Considering the speed at which machine learning is being adopted, and its potential to quickly help companies on multiple fronts, the time for execution and implementation is now. Here are the top three reasons machine learning development for mobile is important right now.

1. Machine learning for mobile increases app security

“Facial recognition” ($4.7 billion, 6.0%) and “fraud detection and finance” ($3.1 billion, 3.9%) were among the top five categories of global AI investment in 2019, according to the AI Index 2019 Annual Report (an independent initiative at Stanford University’s Human-Centered Artificial Intelligence Institute). It’s not surprising. From TikTok’s recent security flaws to Target’s $18.5 million settlement, app vulnerabilities and potential data breaches are breaking news, and there are few signs of a slowdown. While the short-term financial impact can hurt, the long-term cost of losing the trust of customers and partners can be even more painful.
Companies that receive users’ personal information (e.g., passwords, billing addresses, answers to security questions) for processes such as app authentication or making purchases must continually optimize how the data is used. Through machine learning and automating parts of the process, you can identify anomalies faster, allowing you to see patterns and manage potential weaknesses more quickly. Operationally, ML can detect and stanch security issues related to data inside your company, such as logistics or pricing anomalies, that could be a drain on resources. For example, if one of your products is selling faster than usual via a shopping app, it could be related to a pricing error. Do you really want that $450 device on sale for $4.50? The mobile application landscape is composed of a wide variety of operating system versions, devices, and software systems. This creates a much greater number of attack surfaces for attackers to target. (A first step to optimizing security is risk evaluation and awareness. Contact Fusion to hear more.)

2. Machine learning leads to increased mobile privacy

It could be argued that the recent news cycle around privacy indicates a real desire for clarity, if not outright skepticism. In more than 3,600 global news articles on ethics and AI from mid-2018 to mid-2019, the dominant topics were “framework and guidelines on the ethical use of AI, data privacy, the use of face recognition, algorithm bias, and the role of big tech.” You’ve heard about Russia’s role in the 2016 election and the use of personal information for ad targeting. These sorts of debacles haven’t led consumers to give up on digital. Instead, they are demanding more privacy oversight and being more cautious about the apps they use. Privacy concerns are complementary to security issues.
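A first-pass version of the pricing-anomaly check described above can be sketched with a robust z-score based on the median absolute deviation (a common baseline before reaching for a learned model); the prices below are invented for illustration:

```python
import statistics

def flag_price_anomalies(prices, threshold=3.5):
    """Flag prices far from the median using a robust z-score.

    Uses the median absolute deviation (MAD) rather than the mean/stddev,
    so a single wild price can't mask itself by skewing the statistics.
    """
    med = statistics.median(prices)
    mad = statistics.median(abs(p - med) for p in prices)
    if mad == 0:
        return []  # no spread at all; nothing to flag
    # 0.6745 rescales MAD to be comparable to a standard deviation
    return [p for p in prices if abs(0.6745 * (p - med) / mad) > threshold]

# Recent sale prices for a device that normally lists around $450:
prices = [450.00, 449.99, 455.00, 447.50, 4.50, 452.25]
print(flag_price_anomalies(prices))  # [4.5]
```

A production pipeline would run a check like this per SKU as orders stream in, paging someone before that $4.50 “sale” goes viral.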
While security involves keeping personal data safe from hackers, trolls, or criminals, privacy is more about keeping personal data in a person’s own hands, away from any individuals or organizations that don’t need to be privy to it. For example, if you use an activity tracking app to record runs, you might appreciate a note when you hit a milestone: “You had a personal record today!” Machine learning makes it possible for the mobile app to detect this activity directly and send a congratulatory message without any human intervention. There’s no need for a stranger to know you clocked a fast 10K. Machine learning on the edge further increases privacy by eliminating the need for data to be sent to the cloud. When ML on the edge is in place, individualized data never leaves the device, keeping the user’s personal information in their own hands at all times. Amazon Alexa and Google Home employ ML on the edge, as some functions are handled on the device while others have to go to the cloud. In addition to supporting privacy, the reduced travel time for data makes these apps and devices faster.

3. Machine learning for mobile helps create personalized customer experiences

Consumers expect their demographic, behavioral, and other personal data to be secure and private, while they also want increasing levels of personalization. Delivering on these demands can be a delicate, real-time balancing act for companies, but machine learning helps make it possible to juggle data acquisition with protection, along with those prickly questions around how to use the data to everyone’s advantage. But is there a clear business case for pursuing personalization? According to a 2019 Salesforce report, the answer is yes: 75% of 8,000 consumers and business buyers surveyed expect companies to use new technologies to create better experiences. Machine learning for mobile enables you to make user-experience headway on several fronts. First, it can help you build a baseline of customer app usage.
Once you have that baseline, you can see patterns in user behavior. Next, particular behaviors or deviations from the baseline can trigger delivery of a relevant coupon, a suggested product to explore, or a reminder to revisit an abandoned shopping cart. Even more sophisticated, ML can serve up the colors, screen layouts, and language that appeal most to a particular user. And with machine learning, these reactions happen in real time. The more your users engage with your mobile app, the more refined and personalized the experience becomes. Through machine learning, your brand becomes more closely aligned with the customer experience your customers desire. Getting started can feel uncomfortable at first, but at Fusion, we’ve found that organizations often have low-hanging fruit ripe to benefit from machine learning for mobile. You just need to be able to see, and then act on, those opportunities. Working alongside you on this journey should be people who understand data science and machine learning, and who can uncover weaknesses to target. Now is the time to move forward on machine learning for mobile initiatives. Current market conditions indicate a shortage of professionals in machine learning and data science; Fusion fills this gap. If you’re interested in hearing more about machine learning for mobile, let us connect you with one of our experts.
The B2B and B2C markets are abuzz with the terms artificial intelligence, machine learning, and deep learning. But what do these terms mean, exactly? They’re often used very loosely, and you may think they’re interchangeable. But they’re not. Here’s a short overview of artificial intelligence, machine learning, and deep learning to help you cut through the static and determine which solution is right for you and your business.

Working definitions

Artificial intelligence (AI) is the term for the broad discipline that includes anything related to developing machines that are “intelligent” through programming. This includes many daily items you’re familiar with, from smartphones and marketing software to chatbots and virtual assistants. Machine learning (ML) refers to machines and systems that can learn from “experience” supplied by data and algorithms. ML is often used interchangeably with AI, but it’s not the same thing: ML is a developmental outgrowth of AI. Deep learning (DL) is a further developmental outgrowth of ML, applied to even larger data sets. It uses multi-layered artificial neural networks to deliver high accuracy in assigned tasks. In terms of historical development, AI came first. It serves as the foundational discipline from which ML evolved, and ML is the foundational discipline from which DL evolved. One way to conceptualize their relationship is as nested arenas of AI development along a timeline.

Artificial intelligence overview and how it works

In its broadest sense, AI refers to machines programmed to act according to well-defined rules and responses. The responses are confined to the set of rules provided, and the machines can’t deviate from those rules except by failing. A very basic example of AI would be your clothes dryer. You can set a specific time and temperature, and the machine performs the task according to the instructions given.
It doesn’t have the ability to make decisions or make any changes by itself. A more sophisticated example would be configuring your CMS to deliver personalized website experiences. By analyzing a targeted selection of data points about your customer and writing the appropriate logic, your website can display the most relevant content. In neither case is the machine capable of being more than its programming, even if that programming makes the machine very capable of accomplishing its assigned tasks.

Machine learning overview and how it works

“ML is the science of getting computers to act without being explicitly programmed.” (Stanford University)

Machine learning is a different approach to developing artificial intelligence. Instead of hand-coding a specific set of rules to accomplish a particular task, in ML the machine is “trained” using large amounts of data and algorithms that give it the ability to learn how to perform the task. Over the years, the algorithmic approaches within ML have included decision tree learning, inductive logic programming, linear and logistic regression, clustering, reinforcement learning, and Bayesian networks. Currently, there are three general models of learning used in machine learning: supervised learning, unsupervised learning, and reinforcement learning.

Supervised learning

Right now, most machine learning is supervised, which still requires a lot of human intervention to accomplish the training. For example, a “supervisor” has to manually tell the spam filter what to look for in spam vs. non-spam messages (e.g., look for the words “Western Union” or look for links to suspicious websites) until the machine has gained enough “experience” to learn and accurately apply the distinctions. The training goes something like this.
The algorithm would first be trained with an input data set of millions of emails already tagged with spam/not-spam classifications, teaching the ML system the characteristics or parameters of “spam” email and how to distinguish them from those of “not spam” emails.

Unsupervised learning

In general, unsupervised learning is more difficult to implement than supervised learning. But it’s very useful when your example data set has no known answers and you’re searching for hidden patterns. The system has to train itself from the data set provided. Two popular types of unsupervised learning are clustering and association. Clustering groups similar things together: it divides the elements of the existing data set into groups according to a previously unknown pattern. For example, you might define a customer demographic and then cluster those customers by education or income, factors that might tip their purchasing decisions in favor of one product or another. This would allow you to target each cluster of customers more effectively. Association involves uncovering the rules that describe large portions of your data, for instance, “People who buy X also tend to buy Y.” Online book or movie recommendations, for example, are based on association rules uncovered from your previous purchases or searches. Association algorithms are also used for shopping-cart analysis: given enough carts, the association technique can help predict another item you might like to put into your cart.

Reinforcement learning

In reinforcement learning, the system learns by trial and error through training characterized by virtual rewards and punishments. The idea comes from child-development research: instead of telling a child which piece of clothing goes into which drawer, you reward the child with a smile when they make the right choice on their own, or make a sad face when they make the wrong choice.
After just a few iterations, the child learns which clothes go into which drawer. In reinforcement learning, very complex algorithms are designed so that the machine tries to find an optimal solution. It operates according to the principle of reward and punishment, and by this approach it moves quickly through several mistakes or near-mistakes to the correct result, adjusting the weight of previous results against the desired outcome. This allows the machine to make a different, better decision each time until it is rewarded.

Deep learning overview and how it works

Deep learning is the newest area of ML and AI. It uses multi-layered artificial neural networks to accomplish tasks such as object detection, speech recognition, and language translation, all with an extremely high degree of accuracy. Artificial neural networks (ANNs) are inspired by the biology of the human brain, specifically the interconnections between neurons. The human brain analyzes the information it receives and identifies it via neuron connections according to past information it has stored in memory. The brain does this by labeling and assigning information to various groups, and it does so in fractions of a second. Similarly, when a system receives an input, deep learning algorithms train the artificial neurons to identify patterns and classify information to produce the desired output. But, unlike the human brain, artificial neural networks operate via discrete layers, connections, and directions of data propagation. Despite the sophistication of its algorithms, DL is still just another method of statistical learning that extracts features or attributes from raw data sets. The major difference between deep learning and machine learning is that in the latter you need to provide the features manually, whereas DL algorithms extract features for classification automatically. This ability requires a huge amount of data to train the algorithms.
The accuracy of the output depends on the amount of data, and deep learning requires huge data sets. Additionally, due to its sophisticated algorithms, deep learning requires very powerful computational resources: specially designed, usually cloud-based computers with high-performance CPUs or GPUs. There are several kinds of artificial neural networks and DL applications you may have already heard of. Convolutional neural networks (CNNs) are deep artificial neural networks used to classify images, cluster them by similarity, and perform object recognition; these are the algorithms that can identify faces, detect tumors, and help navigate self-driving cars. Generative adversarial networks (GANs) are composed of two neural networks: a generative network and a discriminative network. GANs are very popular on social media; if you feed a GAN a large enough data set of faces, it can create completely new faces that are very realistic but nevertheless fake. Natural language processing (NLP) is the ability to analyze, understand, and generate human language, whether text or speech. Alexa, Siri, Cortana, and Google Assistant all use NLP engines.

Putting AI, ML, and DL to work for you

What most of us think of as AI is, more accurately, machine learning. But understanding the history, development, and distinctions between artificial intelligence, machine learning, and deep learning can help you determine which solution is right for your goals. The solution you choose, however, also depends on the amount and type of data you have access to. Within the last couple of years, almost every company has begun using machine learning or deep learning (and therefore, by definition, artificial intelligence) in some capacity to move their business forward. The competitive gauntlet has been thrown down.
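As a concrete miniature of the supervised-learning spam filter described earlier, here is a toy word-count classifier in the naive Bayes spirit. The messages and vocabulary are invented, and a real filter would train on millions of tagged emails rather than six lines:

```python
import math
from collections import Counter

# Tiny labeled training set, echoing the spam-filter example above.
train = [
    ("win cash now western union transfer", "spam"),
    ("claim your free prize click here", "spam"),
    ("western union wire money urgent", "spam"),
    ("meeting agenda for tuesday attached", "ham"),
    ("lunch tomorrow with the project team", "ham"),
    ("quarterly report draft for review", "ham"),
]

counts = {"spam": Counter(), "ham": Counter()}
totals = {"spam": 0, "ham": 0}
for text, label in train:
    for word in text.split():
        counts[label][word] += 1
        totals[label] += 1
vocab = {w for c in counts.values() for w in c}

def classify(text):
    """Pick the class with the higher log-likelihood, with add-one smoothing
    so unseen words don't zero out a score."""
    best, best_score = None, -math.inf
    for label in ("spam", "ham"):
        score = 0.0
        for word in text.split():
            p = (counts[label][word] + 1) / (totals[label] + len(vocab))
            score += math.log(p)
        if score > best_score:
            best, best_score = label, score
    return best

print(classify("urgent western union cash transfer"))  # spam
print(classify("agenda for the team meeting"))         # ham
```

The point of the sketch is the contrast drawn above: no rule says “flag Western Union”; the phrase simply appears often in the spam examples, so the model learns to weight it.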
Fortunately, tools that were previously only available to enterprise-size companies are now affordable and accessible to mid-market companies, making machine learning the most accessible playground right now. Fusion Alliance provides cloud infrastructure and other ML services that accelerate machine learning modeling, training, and testing for our banking, financial, and retail customers. Read more about our work in AI, ML, and DL:
3 Fusion experts share their machine learning secrets
Unlock customer credit insights with machine learning
How Donatos uses machine learning to retain customers
Conversational marketing and machine learning are shaping the future of retail
In the quest to solve its most pressing challenges, the banking industry is being transformed by its adoption of artificial intelligence (AI) and machine learning (ML). Financial institutions are under pressure to better understand their customers, drive a more personalized customer experience, acquire new business, forecast risk, prevent fraud, comply with increasing regulations, improve processes . . . the list goes on and on. Most banks continue to use traditional, expensive analytics tools to tackle these challenges, but they struggle to keep pace with demands, and the tools are difficult to maintain. Machine learning relies on statistical and artificial intelligence approaches to rapidly uncover patterns in complex data, patterns that can’t be discovered through traditional tools.

The impact of machine learning in banking

While adoption of machine learning in finance is in the early stages, institutions that have leveraged this secret sauce are finding it to be a differentiator. For example, a large regional bank leveraged ML to predict institutional customers’ likely deposits on a daily basis, freeing $40 million in excess cash reserves. Another institution, credit union service organization Primary Financial Company, used ML to synthesize financial and competitive data to price securities, identify buyers, and project trade profitability. PFC can now ascertain with over 80% accuracy and 70% precision the likelihood of a particular investor to buy a given investment. For these companies, their early ventures into ML have certainly moved the needle on what they can accomplish. We spoke with three artificial intelligence and machine learning experts at Fusion Alliance to tap into their experience with banks, learn where the market is headed, and get answers to some common questions.

Q: What do you see as the 2020 trends in machine learning for banks and credit unions?

A – John Dages: 2020 is the year when we see machine learning become more democratized.
Historically, machine learning engagements have required substantial data science and model training investments. However, the major ML platforms are evolving and providing advanced automated machine learning and feature analysis toolchains, lowering the barrier to entry for ML projects. Our team is also actively monitoring new “explainability” techniques to add deeper transparency to ML-based predictions and insights. Historically, the black-box nature of some ML algorithms (specifically deep neural networks) has made it difficult to relate their predictions to business principles. Ideally, these emerging techniques will increase confidence in ML models early in their lifecycle. In the banking sector, we have seen a great deal of capital chase trading and investments, but we are also seeing ML flow into loan operations, cash management, and general risk.

A – Sajith Wanigasinghe: Machine learning applied to fraud detection is a major trend. Artificial intelligence is beneficial here because ML algorithms can analyze millions of data points to detect fraudulent transactions that would tend to go unnoticed by humans. At the same time, ML helps improve the precision of real-time approvals and reduces the number of false rejections. Another leading trend is using robo-advisors for portfolio management. Robo-advisors are algorithms built to calibrate a financial portfolio to the user’s goals and risk tolerance. Chatbots and robo-advisors powered by natural language processing (NLP) and ML algorithms have become powerful tools for providing a personalized, conversational, and natural experience to users in different domains.

A – Patrick Carfrey: Personalized delivery of banking services is going to improve in 2020. New products are entering the marketplace that enable consumer and commercial bank customers to receive relevant account information in real time, at the grain and timeliness that customers want.

Q: What is your favorite machine learning use case for banks right now?
A – John Dages: Machine learning will change the way banks see credit risk. FICO and the five C’s of credit are limited in features, captive to three agencies, potentially biased, and outmoded. The models we are building will allow lenders to view a complete picture of a borrower, offering customized predictions on creditworthiness. The banks that adopt this model will see an increase in lending opportunities while better understanding the liabilities on their balance sheets.

A – Sajith Wanigasinghe: Customer lifetime value is my favorite use case: predicting how valuable a customer will be within X number of years, so that the bank can establish a good relationship with that customer in the early stages.

A – Patrick Carfrey: Remarketing and cross-selling are powerful options for banks right now. Given all the customer data that banks own, including deposits, transactions, and more, ML can tell if a customer is a good target for a new product in the bank’s portfolio. This is especially relevant as customers are expecting more, and being able to predict customer needs supports that expectation.

Related article: 4 ways banks can leverage the power of machine learning

Q: What is the one machine learning data tool you can’t live without?

A – John Dages: Excel. Sure, the enterprise data tools are highly capable (and the team spends a lot of time there), but the ability to quickly navigate data, perform simple transforms, and share data with a tool everyone knows is critical. I can’t remember a project where we didn’t get exemptions to install Excel in the banks’ datacenters.

A – Sajith Wanigasinghe: The TensorFlow framework is one tool I can’t live without: it’s the number one framework I use every day and in 99% of our projects. TensorFlow is an open-source machine learning library that helps you develop your ML models.
The Google team developed it, and it has a flexible set of tools, libraries, and resources that allow me to build and deploy machine learning applications.

A – Patrick Carfrey: TensorBoard. This is TensorFlow’s visualization toolkit, and it provides a nice visual interface for tracing key metrics through the model training pipeline. Deep learning models can get complex quickly, and being able to explore a model outside of the command line is nice. Clients love the graphs, too!

Q: What are the biggest machine learning myths you wish more people understood?

A – John Dages: For those beginning to develop an AI/ML center of excellence, there is going to be a pull toward the cutting edge (deep learning, cognitive science, and others). While there is obviously value there, there is a multitude of “traditional” machine learning practices and algorithms that are lower in complexity. A deep neural network should be the last resort, not the first option!

A – Sajith Wanigasinghe: That machine learning and AI will replace humans. In fact, machine learning and AI will help you do your job much faster and better, and enable you to focus on the satisfying and important human elements of your role, including creativity and strategy. Think of machine learning and AI as a tool, not a replacement for humans.

A – Patrick Carfrey: For every machine learning project I’ve delivered, our clients will inevitably ask, “We love the model, but can you tell us more about how the model is making the predictions?” This is a surprisingly challenging question to answer, particularly for black-box neural networks. Fusion has a variety of techniques to provide additional details, but they aren’t necessarily directly correlated to the actual model we’ve developed. If it is insights you seek, not decisions, consider business intelligence tools and processes in lieu of machine learning. There is room for both!
Meet our panel of experts:

John Dages
With 15+ years of technology leadership experience, John brings a unique perspective to companies on their advanced analytics journey. He has led numerous machine learning initiatives for large enterprises across industries, with projects ranging from customer acquisition and retention to securities pricing and trade analytics. John’s background in application development, analytics, systems integration, and I&O helps him formulate how businesses can use data to drive competitive advantage and engineer true intellectual property.

Sajith Wanigasinghe
Sajith is an expert in machine learning, artificial intelligence, and enterprise-wide, web-based application development. He applies his experience and insights to help enterprises identify and solve challenges across the business that are ideal for machine learning. Sajith has led teams that revolutionized the financial, insurance, food, and retail industries by introducing advanced, intelligent forecasting systems powered by machine learning and artificial intelligence. He holds a B.S. in computer science from Franklin University.

Patrick Carfrey
Patrick joined Fusion Alliance over six years ago, leading a variety of application development initiatives for a flagship Fortune 500 client. Patrick is a firm believer that software is social, choosing to spend as much time as possible in front of end users to build the best product. In that capacity, Patrick has developed and deployed practical machine learning solutions to help better understand and predict customer behavior and drive maximum engagement. He is the Java Competency Lead at Fusion and holds a B.S. in computer science and engineering from The Ohio State University.