Collaborate and manage your workflows smarter, faster, and better with Smartsheet. There are many project management tools available, and it can be a little overwhelming to figure out which one will really get the job done. While Smartsheet is spreadsheet-based, it’s a sophisticated, powerful project management tool that can deliver almost immediate ROI. Smartsheet offers many benefits that help ensure projects are successful and efficient from end to end. Let’s dive into the capabilities that will drive success in your next project.

Integration and Collaboration

With many teams still working remotely, a productive collaboration and project management tool is key to ensuring tasks and milestones are completed efficiently throughout a project. One of the primary benefits Smartsheet offers is the ability to share sheets with internal or external team members. Even users who don’t hold a license can still collaborate, a cost-effective approach when you have multiple projects running simultaneously with many decision-makers who need access. Administrators can also control access levels, a critical feature when many people are involved but not everyone should have edit permissions on shared sheets.

Smartsheet also has connectors for key tools users already work with, such as Microsoft Dynamics 365, Salesforce, and Jira. For example, with the Microsoft Dynamics 365 connector, users can create workflows and apply filters that control which rows sync to objects in Dynamics 365, so that only rows with a particular status or value are included. Workflows can also sync automatically to keep records organized and up to date. As connectors are put into place, it’s critical to review user permissions to ensure syncing works properly and that users save their changes so only the latest version appears on sheets. See an issue? A change history is also kept, recording every modification and the user associated with it, so it’s easy to identify the source of any errors.

A number of other tools allow users to collaborate even further and connect existing databases, CRMs, and ERPs with Smartsheet’s Data Shuttle. Data Shuttle is an add-on that automatically imports data from other software systems and databases directly into Smartsheet, such as actual hours worked from external time-tracking tools or expenses from external accounting systems.

Data Analytics and Reporting

Smartsheet’s utilization reporting is another benefit for managing all of the resources involved in projects. A key factor in project management is gathering information about inefficiencies and who is doing what on each project. Stakeholders always want to know how much time a project will take and how long each step is taking along the way, and Smartsheet takes the guesswork out of hours incurred and forecasting. Gone are the days when key stakeholders made decisions based on opinion alone. Data should back every decision along the way in order to move forward successfully.
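As a concrete (and hypothetical) illustration of the kind of utilization reporting described above, here is a minimal sketch using the Smartsheet Python SDK (smartsheet-python-sdk). The sheet ID and the “Owner” and “Actual Hours” column names are invented placeholders, not part of the article.

```python
# Minimal sketch: total actual hours per owner from a Smartsheet sheet.
# Assumes the smartsheet-python-sdk package; the sheet ID and column
# titles ("Owner", "Actual Hours") are hypothetical placeholders.
import os
from collections import defaultdict

import smartsheet

client = smartsheet.Smartsheet(os.environ["SMARTSHEET_ACCESS_TOKEN"])
sheet = client.Sheets.get_sheet(1234567890123456)  # hypothetical sheet ID

# Map column titles to IDs so cells can be looked up by name.
col_id = {col.title: col.id for col in sheet.columns}

hours_by_owner = defaultdict(float)
for row in sheet.rows:
    cells = {cell.column_id: cell.value for cell in row.cells}
    owner = cells.get(col_id["Owner"])
    hours = cells.get(col_id["Actual Hours"]) or 0
    if owner:
        hours_by_owner[owner] += float(hours)

for owner, hours in sorted(hours_by_owner.items()):
    print(f"{owner}: {hours:.1f} hours")
```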
If you ask a project manager to estimate how long it will take their team to complete a project, they will likely underestimate the actual time required, which leads to frustration and a prolonged time to market. Once real-time data is collected in Smartsheet, visual dashboards can be created to show a project’s performance across various metrics, tailored to whoever is viewing the data. Custom reports can filter out the critical data points each group within an organization needs. Data visualization is a powerful way to identify issues early enough for preventive measures and to translate data into a language that different groups can readily understand.

Improve Operational Efficiencies

With the previously mentioned benefits Smartsheet offers, improved and streamlined operations follow naturally. With the data analytics and resource tracking a project management tool like this provides, operational efficiencies emerge organically, as long as the data being collected is acted upon. The insights these tools provide are all but priceless: reviewing the reporting lets you pivot where necessary to support business growth and avoid issues that would never have been visible without bringing all of this data together on one shared sheet or dashboard. Clearly seeing inefficiencies in any project allows stakeholders to make critical decisions quickly and decrease time to market instead of prolonging projects and missing deadlines.

A comprehensive project management tool should not be seen as a luxury but as a necessity, especially at a time when many teams are likely to still be working remotely and you need to track how critical resources are spending their time and how long tasks are taking. Survey results from PMI’s Pulse of the Profession® revealed that an average of 11.4 percent of investment is wasted due to poor project performance. And organizations that undervalue project management as a strategic competency for driving change report an average of 67 percent more of their projects failing outright. A project management tool is the solution to eliminating poor project performance, and organizations can’t treat these tools as optional – they are a critical asset for ensuring your business still has a pulse and isn’t on the fast track to failure. Many organizations now run quite lean, with resources doing more than one job while juggling a whole portfolio of projects at any given time. Smartsheet not only gives insights to key stakeholders, it also gives your teams the tools to manage their own time and tasks, which helps avoid micromanaging. Smartsheet is a dynamic tool that helps organizations across industries increase output and work more efficiently, internally and externally.

Multi-layered Security

It’s always great to know that a software tool you’re planning to use has access to all of your organization’s databases and lets them talk to each other seamlessly, right? Well, for some, especially IT, that may raise red flags about the risk of a data breach. With Smartsheet, security is built into the product: the organization’s data is protected with multi-layered access permissions and quarterly access audits.
Smartsheet also has policies and procedures in place to ensure an organization’s data is backed up in multiple locations, with a team constantly evaluating security threats and implementing countermeasures to help prevent unauthorized access or downtime. As part of its multi-layered data security strategy, Smartsheet also encrypts data with NIST-approved ciphers. According to Statista, there were just over 1,000 data breaches and millions of records exposed in the U.S. last year. When adopting a tool that has access to this much data, an organization should insist on a comprehensive data protection strategy that is nothing less than the gold standard.

Not sure where to start? A Smartsheet platinum partner like New Era Technology can guide your organization through implementing this powerful tool across the business and remove any apprehension that comes with taking that first step. The only regret you might have is that the tool wasn’t implemented from day one.
Wish You Could Call Miss Cleo to See the Future?
Predictive models tell you more than any tarot reading

“Call me now for your free tarot reading!” If you were ever up late watching television in the late ’90s, you remember Miss Cleo, the woman who claimed she could see the future and tell your fortune with an over-the-phone tarot reading. You probably read the opening line in her dramatic Jamaican accent and can picture her with her crystal ball. Of course, it was “for entertainment only,” but that didn’t stop thousands of people from calling in, hoping for a glimpse of the future to help them make tough decisions.

Wouldn’t it be great if you could see what the future holds for your business, so you could make smart decisions and build strategies that support upcoming trends and mitigate risk? But if you can’t call Miss Cleo, what can you do? You look into the past. And no, we’re not talking about holding a séance. We’re talking about leveraging your data with machine learning. Machine learning analyzes your data for trends and patterns that can be used to develop predictive models to forecast economic trends, customer behavior, and marketing campaign success.

Learn More: Modern data platforms support machine learning to develop predictive models

Get Smart: Want to learn more about the rise and fall of the self-proclaimed psychic? Check out Call Me Miss Cleo, a new documentary on HBO Max that provides an illuminating, in-depth look at the woman, the empire that was built around her, and her eventual downfall.
Using Microsoft’s Orleans actor framework for massive scalability
In enterprise organizations, legacy systems aren’t easy to overthrow – or to rearchitect. When you’re facing a system that has grown in size and complexity, one that has you locked in for certain types of functionality but hamstrung in scalability, you need:

- An unlimited budget
- An unlimited timeline, OR
- An actor framework

If options A and B aren’t in the cards right now, you’re not alone. And if you’re looking into actor frameworks for new solutions to legacy system modernization and scalability, you’re in good company there, too.

The challenge of scaling cloud software

The cloud makes it easy to scale hardware, allowing business and UI logic to execute on as many simultaneous servers as you need or want, but writing software at that scale is much more difficult. Many common strategies attempt to solve that problem by increasing the scalability of the data layer, but each of these approaches introduces added complexity and requires significant effort to implement. For example, implementing caching patterns can reduce the load on the data layer, speeding time to scale. However, this tactic requires significant application recoding and introduces thorny problems such as determining when and how to invalidate the distributed cache. In some limited circumstances, an organization may be able to use a NoSQL database such as MongoDB to streamline scalability in the cloud. But even if a NoSQL database is an option – and for many organizations, it is not – transitioning from a traditional datastore to a non-relational database is a large undertaking, requiring the organization to take on significant risk and expend substantial effort.

Enter the Orleans actor framework

The Orleans actor framework solves the cloud scalability problem in a much more robust and elegant way. Although Orleans could replace the data layer entirely, it’s more often used to add a layer of scalability to the business logic that overlays the data layer, reducing pressure without adding complexity. Orleans allows the data layer to exist in different states and have persistence, which delivers better performance than caching or stateless business logic in a cleaner, more streamlined manner. And, like NoSQL databases, Orleans can be massively distributed across as many servers and clusters as needed.

If you’re a gamer, you might be familiar with the Orleans actor framework. Developed by Microsoft to support massively multiplayer online (MMO) games like Halo 4 and Halo 5, Orleans has recently gained popularity as a cross-platform framework for interactive applications in enterprise settings. Because our team has a strong background in actor frameworks, we jumped into the vanguard of this emerging opportunity and were eager to test it out. In our experience, modernizing a legacy system at scale often comes at a price, and the cost of maintaining service when multiple components are subject to failure is usually a factor. One of the first things we noticed about Orleans, versus other actor frameworks like Akka.NET or Service Fabric, was the unusual combination of scalability and fault tolerance. If you think about Orleans as an actor framework designed to support millions of people engaged in multiplayer games at once, it makes sense that sophisticated failover processes were baked in. Overall, the fact that Orleans had been deployed and proven in that massive multiplayer context gave our team a high level of confidence in suggesting the framework to our business clients.
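Orleans itself is a .NET framework (grains are written in C#), but the underlying actor pattern is easy to sketch in any language. Here is a minimal, hypothetical illustration in Python using asyncio. It shows the general idea of an actor that owns its state and processes messages one at a time; it is not Orleans’ actual API.

```python
# Minimal actor-pattern sketch (the general idea behind Orleans grains,
# not Orleans' actual API, which is .NET/C#).
import asyncio


class CounterActor:
    """Owns its state; only its own loop ever touches self.count."""

    def __init__(self):
        self.count = 0
        self.mailbox = asyncio.Queue()

    async def run(self):
        # Messages are processed one at a time: no locks, no races.
        while True:
            amount, reply = await self.mailbox.get()
            self.count += amount
            reply.set_result(self.count)

    async def add(self, amount: int) -> int:
        reply = asyncio.get_running_loop().create_future()
        await self.mailbox.put((amount, reply))
        return await reply


async def main():
    actor = CounterActor()
    worker = asyncio.create_task(actor.run())
    # Many concurrent callers; the actor serializes every update.
    results = await asyncio.gather(*(actor.add(1) for _ in range(5)))
    print(results)  # [1, 2, 3, 4, 5]
    worker.cancel()


asyncio.run(main())
```

In Orleans, the framework supplies what this sketch leaves out: grains are activated on demand, distributed across a cluster, persisted to storage, and failed over automatically when a server goes down.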
Benefits of implementing an actor framework designed for MMO

In 2022, Fusion’s technology team started putting Orleans into practice, with great success, at large-scale organizations looking to modernize legacy monoliths. Although it’s not perfect for every use case, we’ve seen significant wins in several key areas:

Observability

In traditional web-based systems, a user inputs information and a page refreshes, while behind the scenes, data is processed, databases are updated, and new outputs are pushed to the interface. Observability streamlines that process to deliver outputs in real time, providing a fluid user experience and an elegant workflow – even at full scale. In Orleans, actors – called “grains” – have built-in observability, allowing grains to trigger events and run logic when data changes. Once an Orleans grain is activated, it lives in memory with business logic and data in the same location, allowing it to perform at incredibly low latency and with blazingly fast throughput. Orleans grains can be persisted to relational or non-relational databases, further enhancing scalability. This design makes Orleans a great addition to asynchronous, event-driven architectures.

Conversational UI

Orleans also supports a fully responsive and interactive user interface. Using Angular and SignalR to facilitate call and return through established databases and pipelines, we can implement a truly conversational UI. With applications for streamlining internal business processes and enhancing customer experience in public-facing use cases, a conversational UI delivers a cutting-edge experience and enables a fully scalable, customizable way to put the user at the center of any process.

Lightning speed

As an actor framework, Orleans makes it easy – and incredibly fast – to support large numbers of processes simultaneously. Again, the fact that Orleans was developed as a gaming framework serves as a benefit! Each unit of work in Orleans is an actor in the system that updates its own rules and model, which streamlines processing by running logic where the state lives and eliminating unneeded lookups.

Using Orleans with advanced monitoring and IaC

While an actor framework like Orleans can deliver a major impact, it’s not a stand-alone modernization solution. In implementing Orleans across enterprise architectures, we’ve found that advanced performance monitoring and infrastructure as code provide considerable companion benefits.

Orleans + monitoring

To lock in the scalability and fault tolerance Orleans promises, you’ll need to pair it with an advanced monitoring process. Several vendors offer monitoring solutions that support tiered responses across an unknown number of instances for an environment that scales in multiple dimensions. Look for an option that is highly responsive and has capabilities that can address your edge cases.

Orleans + IaC

When you’re looking at agile performance at massive scale, infrastructure as code (IaC) delivers the speed and consistency you need. By managing and provisioning infrastructure through easily editable and distributable scripts and configuration files, IaC reduces manual processing steps and cuts slack out of the process.
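As a concrete (and hypothetical) illustration, here is what a small slice of infrastructure as code can look like using Pulumi’s Python SDK; the resource names are placeholders, and any IaC tool (Terraform, Bicep, CloudFormation) makes the same point.

```python
# Hypothetical IaC sketch with Pulumi's Python SDK. Resource names are
# placeholders; the point is that the environment is declared in a
# versionable script instead of provisioned by hand.
import pulumi
import pulumi_aws as aws

# Running `pulumi up` provisions these resources; re-running converges
# the live environment back to exactly this specification.
state_bucket = aws.s3.Bucket("grain-state")   # e.g., persisted actor state
event_queue = aws.sqs.Queue("event-ingest")   # e.g., event-driven inputs

pulumi.export("bucket_name", state_bucket.id)
pulumi.export("queue_url", event_queue.url)
```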
Figuring out the future of modern app dev

Stepping into emerging solutions and applying technology in new ways is what Fusion’s team of experienced full-stack technical architects does best. From layering a state-of-the-art UX over an aging system so the modernization process is seamless to the end customer, to repositioning technology to support evolving business processes, our team is at the forefront of modern app dev. We stay on top of what’s new and what’s next so that our enterprise clients don’t have to chase buzzwords.

Actor frameworks like Orleans deliver exciting results, but they aren’t for everyone. Our team can answer your questions, help you unpack your challenges, and suggest the right path forward. Let us know what you need or set up a time to talk.

Connect with our technology team >>
Who Is Your New Technology Benefitting?
There’s a rapidly growing space in tech that doesn’t have a name, but you know it when you see it. We call it “Tech nobody asked for.” Tech nobody asked for is typically modern technology, such as a mobile app, device, or software, that doesn’t solve a problem or improve an experience. Often, it creates additional steps or complications. Here are a few examples:

- An internet-connected hairbrush that “listens” to hair and recommends products
- A Bluetooth water bottle that syncs to a “hydration app”
- A smart egg tray that syncs to your phone to tell you how many eggs you have and how fresh they are

These may be extreme examples, but there are near-infinite others. Have you been to a restaurant that got rid of printed menus and now requires you to scan a QR code or download a mobile app to see the menu? No one wants a QR code menu.

Your business wants to solve customer problems and offer the best experience. Providing a mobile app or other new tech solution seems like the right answer, but you want to avoid investing in and launching tech nobody asked for. That’s where our technology and marketing teams can help. We identify customer concerns, including obstacles in the buyer journey or gaps in customer experience, and help you build out tech solutions to overcome those issues, including mobile applications or website improvements.

Check out this case study to see how Fusion did exactly that. >>

If you want to learn more about custom, customer-friendly technology solutions, we can help you get started.

Want to see more examples of tech nobody asked for? MIT Technology Review released its picks for the 10 worst technologies of the 21st century. Is there anything on the list you disagree with?
Stick with Your New Year’s (Marketing) Resolution
Over 40% of Americans make New Year’s resolutions each year, and of those, fewer than 10% stick with them. Honestly, most people give up before Punxsutawney Phil makes his annual appearance. Why is it so hard to stick with a resolution? Usually, it’s because the steps to success look something like this:

1. Make a resolution: “I’m going to get in shape!”
2. Invest in equipment to support the resolution: “I bought an exercise bike!”
3. Do an activity to support the resolution: “I’m waking up 30 minutes early every day to exercise!”
4. Fall off the wagon once: “I hit snooze and well… attempts were made!”
5. Use the exercise bike to hang sweaters.

(For those of you who have clothes hanging from your workout equipment, you may be feeling personally attacked right now, and we apologize for that.)

To stick with a resolution, you need clear goals, a strategy to meet them, support to guide you over obstacles and challenges, and measurable results to track progress. If your team’s New Year’s resolution is creating a successful content strategy this year, it will take more than declaring, “This is the year we get good at content!” and purchasing the latest, greatest marketing software to make it happen. It takes a clear, consistent message and customer insight into the buyer’s journey. Plus, you need to know the right channels to publish content on and follow a set publishing schedule.

It sounds like a lot. Because it is. (Which is why so many companies struggle to stick with it!) But when you have the right support in place, such as a team of experts who can work with you to build a content strategy that aligns with your brand and resonates with your audience, this is one New Year’s resolution that is sure to be a success.

Check out how we supported a brand and content marketing strategy for a pharmaceutical tech company and tripled leads over six months >>
Elevating your digital footprint (tap dancing optional)
How to make websites more spirited

As a modern organization on the path to digital transformation, you probably spend a lot of time thinking about how to keep your brand in line with shifting customer expectations. But if you’re like most organizations, your website may not be keeping up. Is your digital footprint giving your target audience brand whiplash?

To paraphrase Will Ferrell and Ryan Reynolds, no website is truly unredeemable, but it might take some serious choreography to make it accurately reflect the heart of your business. One of our clients, a cutting-edge technology brand serving the pharmaceutical industry, ran into this problem earlier this year. They were growing at hyperscale, but their website didn’t match who they were becoming, from a visual branding or a core messaging standpoint. We had been a marketing partner for the company’s product launches, so when they needed new website content to support an ABM initiative, we helped them redesign, rewrite, and transform the experience of their website – in just four months.

Check out the results >>

Maybe it’s time to stop tap dancing around the issues you’re having with your website. If you’re interested in elevating your digital footprint by redesigning, replatforming, or making incremental changes to your website and mobile experiences, we can help.

Make your digital footprint more spirited >>
Are modern data solutions on your gratitude list?
Are modern data platforms on your gratitude list?

When you’re dealing with an inflexible, monolithic technical architecture, getting the right information at the right time is like trying to cook a traditional Thanksgiving feast in a microwave. You need better tools for the job. Thankfully, modern data solutions like data mesh frameworks can help.

Using a data mesh distributes information across autonomous domains, allowing business users to own, manage, and share their data in a separate virtual environment, while governance remains centralized. It’s the data equivalent of asking your cousins to make the side dishes while you handle the turkey and set the table.

Interested in figuring out your options for modernizing and democratizing your data frameworks? Check out our latest deep dive on modern data solutions. Your data consumers will be grateful!

Get your extra helping of modern data trends >>
5 Imperatives of a modern data platform
Accessing data at the speed of business is critical to remaining competitive in a digital-first world. But if you’re relying on outdated architecture, where your data is trapped in silos or lost in a data lake, access to the functional data you need is seriously limited. When your existing framework is no longer serving your business, it makes sense to transition to a modern data platform, but you may have hesitations about whether it can help you succeed. To help you better understand this solution and what you stand to gain from it, we’re looking at data platform capabilities and sharing five modern data platform imperatives that will help you achieve a more logical data management system.

What is a modern data platform?

With so many emerging data solutions, we understand that the data landscape is complicated, so we want to start by clearly defining what a modern data platform is and what it can do. A modern data platform is a flexible, cloud-based, end-to-end data architecture that supports collecting, processing, analyzing, and delivering data to the end user in a way that is aligned and responsive to the needs of the business. On the surface, aside from being cloud-based rather than on-premises, modern data platform capabilities aren’t different from traditional data architecture. The difference is in how new technologies have expanded those capabilities. Here are some of the ways modern data platforms can deliver more for your organization:

Data ingestion

Bringing new data into the environment is the first step in managing data, and in a legacy architecture, that is mainly done through batch processing, which collects and processes data at specific time periods or intervals. By leveraging the higher computing capacity of a cloud-based architecture, data can instead be streamed in real time to data storage, eliminating bottlenecks and delays and keeping data moving through the system in a more fluid manner.

Quality and governance

With AI integrated into the architecture, data quality and governance tools can be automated, speeding up how new data sources are analyzed, categorized, and assessed for security concerns.

Security

Security measures can be integrated at the base level for new data products, providing inherent encryption whether data is at rest or in transit. Within a modern data platform, security measures are implemented to dynamically filter and obscure data as needed to support your organization’s security policies.

Storage

Cloud-based architecture offers nearly unlimited storage on a pay-as-you-go model, so you only invest in the volume of storage you need today. As your data storage needs increase, you can add and seamlessly integrate additional space without creating silos for new data.

Transformation

In a legacy architecture, transformations such as quality adjustments and business logic need to be applied in the early stages of the data flow, during large batch processing. While this makes downstream usage of the data more performant, it also locks the business rules in place, removing flexibility in how the business looks at and interacts with the data. The expanded computing power and advanced tools in a modern data platform offer a more flexible timeline for adding transformations. Business rules and logic can be applied later in the data flow and adapted to suit changing needs.
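To make the “transform later” idea concrete, here is a small, hypothetical pandas sketch: raw records are landed untouched, and business rules are applied at read time, so a rule can change without re-ingesting anything. The column names and rules are invented for the example.

```python
# Hypothetical sketch: land raw data as-is, apply business rules at
# read time instead of baking them into ingestion.
import pandas as pd

# Raw layer: stored exactly as received, no business logic applied.
raw_orders = pd.DataFrame({
    "order_id": [1, 2, 3],
    "amount": [120.0, -15.0, 890.0],   # includes a refund
    "region": ["east", "WEST", "East"],
})


def business_view(raw: pd.DataFrame, include_refunds: bool) -> pd.DataFrame:
    """Business rules applied late; changing them needs no re-ingestion."""
    view = raw.assign(region=raw["region"].str.lower())
    if not include_refunds:
        view = view[view["amount"] > 0]
    return view


# Two teams can apply different rules to the same raw data.
print(business_view(raw_orders, include_refunds=False))
print(business_view(raw_orders, include_refunds=True))
```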
Discovery

Data discovery is streamlined through integrated tools that automatically scan, categorize, and organize metadata so the most appropriate data can be accessed more easily and quickly.

Delivery

In a legacy architecture, data delivery and visualization tools required data to be specifically structured before business usage, whether for reporting, data extracts, or API access. Now, visualization tools have advanced features that support access to semi-structured and unstructured data without intensive (and expensive) data processing. Integrated tools simplify both data extraction and data sharing and have built-in security and monetization features.

DevOps and DataOps

In a modern data platform, DevOps and DataOps are cross-platform and cross-language, which makes it easier and faster to coordinate development and release tasks when architectures are built using multiple tools.

5 modern data platform imperatives

The overall framework, capabilities, and patterns of managing data are universal within a modern data platform. However, no two platforms are the same. Each one is highly customized to support the data and data needs of the organization and requires a different combination of tools and features to achieve specific functionality and cover the needed capabilities. You still need to ensure your platform manages data in a way that aligns with your organization’s unique needs, which means five modern data platform imperatives must be met.

1. Greater flexibility

The greatest challenge of legacy data architecture is its lack of flexibility. Physical servers can’t be added or modified easily to meet the changing data needs of your organization, so they have to be built with capacity for future data needs. That is easier said than done, given the rapidly changing landscape and the sheer volume of data you’re taking in. A modern data platform is incredibly flexible. It allows you to consider your data needs today and budget accordingly, rather than trying to predict your future data needs, which requires a significantly larger investment. As you need to increase data storage, adopt automation, or pivot in your data needs, these updates can be integrated seamlessly into the platform.

2. Improved access

The people and applications accessing data need it in real time and in the proper format, but the needs of your data science team vary greatly from the needs of your business intelligence team. A modern data platform must support a faster time to market for data assets, and one way it does this is through a medallion architecture. A medallion architecture creates a multi-layered framework within the platform to move data through a pipeline to the end user:

- Bronze layer: Raw data is collected directly from the source systems with little to no transformation and stored here to provide a base layer of full history for additional processing.
- Silver layer: Data from multiple sources is curated, enriched, integrated, and organized in a structure that reflects the data domains of the organization.
- Gold layer: Data needed to support specific business drivers is aggregated and organized so it can be used for dashboard creation and self-service analysis of current states and trends.

This architecture allows a diverse user base to access the data in the form that best suits their needs, as the sketch below illustrates.
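Here is a rough sketch of how data might move through those layers, with pandas standing in for whatever engine the platform actually uses; the column names and cleanup rules are invented for illustration.

```python
# Rough medallion-architecture sketch; pandas stands in for the
# platform's real engine, and all names are invented.
import pandas as pd

# Bronze: raw history, kept with little to no transformation.
bronze = pd.DataFrame({
    "ts": ["2023-01-05", "2023-01-06", "2023-01-06"],
    "customer": [" Acme ", "acme", "Zenith"],
    "revenue": ["100", "250", "75"],
})

# Silver: curated, typed, and conformed to the organization's domains.
silver = pd.DataFrame({
    "ts": pd.to_datetime(bronze["ts"]),
    "customer": bronze["customer"].str.strip().str.title(),
    "revenue": pd.to_numeric(bronze["revenue"]),
})

# Gold: aggregated for a specific business driver (e.g., a dashboard).
gold = silver.groupby("customer", as_index=False)["revenue"].sum()
print(gold)
```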
Data scientists can access raw data from the bronze layer to identify new and emerging patterns, business applications can access data in the silver layer to produce data products, and business users can access the gold layer to perform analytics and create dashboards.

3. Incremental implementation

Rather than transitioning to a modern data platform in a single giant step, we recommend an incremental move. This makes it significantly easier and faster to focus on the current data products your organization needs, like reports and dashboards, while you start building out the initial infrastructure. An incremental implementation lets you take a clear, informed look at the data you need, how you need it, and how it aligns with your business drivers. You can then choose to add, adjust, or stop processing certain data to put more focus on the data that will answer pivotal business questions. By building only what you need when it’s needed, an incremental implementation also saves money and avoids carrying over old data that no longer serves your business.

4. Better communication between IT and business users

A modern data platform needs to support improved communication between your IT or data engineering teams and your business users. As data flows through the framework and reaches end users in the language they speak, everyone gains clarity. For business users, this may mean seeing gaps where the existing data doesn’t directly answer their questions and finding a different way to utilize the data. For data engineers, this may mean seeing opportunities to filter out aberrations in the data to improve the aggregated results. This clarity allows the teams to work together on solutions that cover existing and emerging needs.

5. Refocus valuable resources

Once the initial data set is built, we apply repeatable patterns to the mechanics controlling data ingestion, storage, and delivery. Having a proven framework that can be repeated across unlimited data sets saves time and reduces the cost of building, operating, and maintaining the platform. Your data team can refocus their time on higher-level tasks, including improving data quality and speeding up delivery.

Whether you have questions about data platform capabilities and functionality or you’re ready to make the shift to a modern data platform, we’re here to help! Set up a call to talk to an expert or visit our modern data platform hub to learn more.

Ask us your questions >>

Learn more about modern data platforms >>
Promoting API discoverability
This panel was moderated by John Dages, Technology Solution Director at Fusion Alliance, and included (left to right): Shawn Chapla, Technology Consultant at Fusion Alliance; John Heckler, Senior Engineering Manager at Kroger; and David Imhoff, SVP of Core Software Engineering at Fifth Third Bank.

You’ve made a commitment to domain-driven design. You catalog everything you build in a curated, well-documented API portal. And yet, teams across your organization continue to waste effort building the same components. Why? You haven’t nailed API discoverability.

Moving to an API-first organization is a long journey. Creating long-term teams around each service, getting components cataloged, and adding search features like tagging or badging are all part of the process. But until you get the discoverability piece in place, you’ll continue to experience API bloat as developers search for what they need, fail to find it, and build another one. Or a third one. Or a tenth.

Enable API discoverability by centralizing the developer experience

To effectively communicate API availability, particularly in a larger organization, consider going beyond the catalog. Organizing a developer portal around domains, adding documentation and a data dictionary, and making your API marketplace a place that centers the developer experience can go a long way toward good governance and more effective use of your resources.

Taking a wider view than just your API catalog can give your organization a more coherent pathway into composability, along with tangentially related goals like your enterprise data organization, domain segmentation, and security. You’re not only building a framework for repeatable composition, you’re also making the developer process nearly seamless, freeing up those team members to spend their time solving business problems and meeting concrete business goals.

About our panelists

David Imhoff is the SVP of core software engineering at Fifth Third Bank, where he drives the bank’s domain service API strategy. David has 17 years of software engineering and cybersecurity experience at Fifth Third, General Electric, and Cincinnati Bell, where he has built cyber detection capabilities, led large software organizations, and is currently building out the core banking replacement for Fifth Third. David holds a BS in Information Technology from the University of Cincinnati and an MBA from Xavier University. He is passionate about mentoring and loves to help elevate the next generation of technology professionals. And he enjoys spending time with his wife and two daughters, golfing, and watching the Cincinnati Bengals.

John Heckler is a Sr. Engineering Manager at Kroger responsible for Kroger’s API Program and Platform. John has over 25 years of experience in software architecture and development, over 15 years of experience managing development teams, and a passion for building and motivating high-performing teams. John holds a BS in CS from Bowling Green State University and a Master of Computer Science from the University of Dayton.

Shawn Chapla is a Consultant at Fusion Alliance in the Technology Practice area, focusing on helping customers accelerate their business through API- and cloud-driven digital transformation. Prior to joining Fusion, he spent more than 40 years delivering technology solutions in the pharmaceutical, defense, and computer systems industries.
Proving the value of composability
This panel was moderated by John Dages, Technology Solution Director at Fusion Alliance, and included (left to right): Shawn Chapla, Technology Consultant at Fusion Alliance; John Heckler, Senior Engineering Manager at Kroger; and David Imhoff, SVP of Core Software Engineering at Fifth Third Bank.

Every business wants to move with agility in the digital world. Most make customer-facing digital transformation plans, which is critical, but miss out on the backend technical frameworks that make digital aptitude possible. In a rapidly changing business environment, being able to pivot technologically can make or break a company’s viability. But how can you translate technical capabilities into concrete outcomes your leadership and business users can understand?

How to build support for composability across the business

Most C-level leaders are not technology experts, so to gain their sponsorship and alignment for the shift to composability, you’ll need to tie the technical gains you anticipate to concrete business objectives. After all, moving to a more composable, API-first framework is a culture change that calls for a new mindset. Many organizations find that the key to unlocking support for composability is tying the change to the organization’s long-term journey toward digital transformation. The API-first approach dovetails easily with that bigger picture, as a composable enterprise tends to be quicker to market and more efficient with developer resources, to say nothing of the profit associated with monetizing your API marketplace.

As you start making the change, be sure to quantify incremental value in terms of business goals and profit. Some examples might include:

- Showing how API and component reuse avoids future costs
- Demonstrating the ways that APIs and composable frameworks enable business wins
- Measuring time and cost to market for new initiatives

The composable enterprise is a long game. In the short term, your organization may see building an API catalog or establishing a new architecture as a bottleneck to its plans. Proving value, both initially and along the way, can go a long way toward bringing the business with you on your path to modernizing your enterprise architecture.

About our panelists

David Imhoff is the SVP of core software engineering at Fifth Third Bank, where he drives the bank’s domain service API strategy. David has 17 years of software engineering and cybersecurity experience at Fifth Third, General Electric, and Cincinnati Bell, where he has built cyber detection capabilities, led large software organizations, and is currently building out the core banking replacement for Fifth Third. David holds a BS in Information Technology from the University of Cincinnati and an MBA from Xavier University. He is passionate about mentoring and loves to help elevate the next generation of technology professionals. And he enjoys spending time with his wife and two daughters, golfing, and watching the Cincinnati Bengals.

John Heckler is a Sr. Engineering Manager at Kroger responsible for Kroger’s API Program and Platform. John has over 25 years of experience in software architecture and development, over 15 years of experience managing development teams, and a passion for building and motivating high-performing teams. John holds a BS in CS from Bowling Green State University and a Master of Computer Science from the University of Dayton.
Shawn Chapla is a Consultant at Fusion Alliance in the Technology Practice area, focusing on helping customers accelerate their business through API- and cloud-driven digital transformation. Prior to joining Fusion, he spent more than 40 years delivering technology solutions in the pharmaceutical, defense, and computer systems industries.
Is data virtualization the future of data integration and management?
In a perfect world, all your data would be stored in an updated, organized database or data warehouse, where your business intelligence and analytics teams could keep your company ahead of the competition by accessing the precise data they need in real time. In reality, as your organization has grown, your data has probably been stretched across multiple locations, including outdated databases, localized spreadsheets, cloud-based platforms, and business apps like Salesforce. This not only causes costly delays in accessing information but also impacts your teams’ ability to make informed, data-driven decisions about both day-to-day operations and the long-term future of your organization.

So, how do you improve access to your data when it’s siloed in multiple areas? Data virtualization, while still fairly new, is an efficient, effective data delivery solution that offers real-time access to the data your teams need, and it is rapidly growing in popularity among large and enterprise-level organizations. The market was estimated at $1.84 billion in 2020, and with a 20.9 percent CAGR, it is projected to grow beyond $8 billion by 2028, according to a 2022 Verified Market Research report. To help you determine whether data virtualization solutions are the best option for your company, we’ll look at what data virtualization is, how it can solve your greatest data challenges, and how it stacks up against other data integration solutions.

Understanding data virtualization

First, what is data virtualization? When you have data housed across multiple locations and in various states and forms, data virtualization integrates these sources into a single layer of information, regardless of location or format, without replicating your information into new locations. This layer of data is highly secure and easily managed within governance best practices, and it allows the data consumers within your organization to access the information they need in real time, bypassing the need to sift and search through a variety of disparate sources.

Data virtualization supports your existing architecture

Data virtualization does not replace your existing data architecture. Instead, it’s a single component in a larger data strategy, but it is often essential to executing the strategy successfully and meeting the goals of your organization. Think of your current data architecture as an old library where your data is kept on a variety of shelves, over multiple floors, with some of it stored in boxes in the basement. When you’re looking for specific information, you have to go on an exhaustive, lengthy search, and you may not even find what you need. Data virtualization acts as the librarian who understands the organizational system, knows exactly where everything is located, and can provide you with the information you need immediately.

Choosing data virtualization vs an ETL solution

When reporting is delayed, analytics are inaccurate, and strategic planning is compromised by bottlenecks, it’s essential that your organization prioritizes how data is integrated and accessed. Traditionally, the only choice was Extract, Transform, and Load (ETL), an intensive process in which all your data is duplicated from the original sources and moved into a data warehouse, database, or other storage. While ETL can bring your data together, there are two key problems with this method. First, the cost of moving and relocating data is often organizations’ chief concern. Second, while ETL does improve the collection of data by consolidating it in one location, it doesn’t improve your connection to the analyzable data needed to improve day-to-day operations.
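From the consumer’s side, a virtualization layer typically looks like a single SQL endpoint sitting over all of those sources. Here is a hedged sketch of that experience in Python; the DSN, view names, and columns are invented, though platforms such as Denodo do expose standard JDBC/ODBC interfaces.

```python
# Hedged sketch: one query against a virtual layer; the DSN and the
# view/column names are invented for illustration.
import pyodbc

conn = pyodbc.connect("DSN=virtual_layer")  # hypothetical ODBC data source

# A single query; the virtualization layer federates it across the ERP,
# the cloud CRM, and the spreadsheet exports behind the scenes.
rows = conn.cursor().execute(
    """
    SELECT c.customer_name, SUM(o.amount) AS total
    FROM crm_customers c
    JOIN erp_orders o ON o.customer_id = c.customer_id
    GROUP BY c.customer_name
    """
).fetchall()

for name, total in rows:
    print(name, total)
```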
Data virtualization solutions, by contrast, streamline how you access and connect to your data. Your business users submit a query, and the Denodo data virtualization platform pulls the data from across locations, extracts the relevant information, and delivers it in real time in the needed format, ready to analyze and use. The result? Increased productivity, reduced operational costs, and improved agility among business users, while your architects and IT teams keep greater control over governance and security.

Take a deeper dive into data virtualization solutions

Ready to dig deeper into data virtualization? We partnered with data management leader Denodo Technologies to put together Modernizing Integration with Data Virtualization, a highly informative webinar to help you learn how data virtualization helps your company save time, reduce costs, and gain better insight into your greatest asset. To learn how Fusion Alliance can create custom data virtualization solutions to scale your data management and improve access, reach out to our team. Ask us any questions or set up a quick call to explore your options.

Learn more about modern data platforms >>
Data fabric vs data mesh: Choosing the best data architecture for your organization
Whether your data is housed in a monolithic data architecture or across multiple, disparate sources such as databases, cloud platforms, and business applications, accessing the specific information you need when you need it probably presents a huge challenge. The length of time it takes to find data may have you or your analytics teams constantly relying on outdated information to run reports, develop strategies, and make decisions for your organization. If you’re exploring data solutions that will improve time to market while simplifying governance and increasing security, you’ve probably come across the terms “data fabric” and “data mesh,” but you may not know how to apply them to your business. To help you better understand these emerging trends in data architecture, we’re digging into what a data fabric and a data mesh are and the specific benefits they bring to large and enterprise-level organizations. This will give you the foundational knowledge to choose between data fabric and data mesh, or to determine how both may serve your organization.

What is data fabric?

When you think of every bit of data in your organization as an individual thread, it makes sense that it takes so long to access specific information. If thousands of individual threads are stored together in a bin, as in a monolithic architecture, or separated across hundreds of individual boxes with little to no organizational method, as in a distributed architecture, how long would it take to find the single thread you’re looking for and get it untangled so you can use it? A logical data fabric solves this problem by weaving all the threads of data together into an integrated, holistic layer that sits above the disparate sources in an end-to-end solution. Within the layer, multiple technologies work together to catalog and organize the data, while machine learning and artificial intelligence improve how new and existing data are integrated into the fabric and how data consumers access it.

Are data virtualization and data fabric the same?

A common misconception is that data virtualization and data fabric are the same. On the surface, they both support data management through the creation of a single, integrated layer of processed data atop distributed or unstructured data. Data virtualization is an integrated abstraction layer that speeds up access to data and provides real-time data returns, and this technology is a key component within the data fabric. However, data virtualization is still only one of the multiple technologies comprising the data fabric, which is a more comprehensive data management architecture.

Benefits of data fabric

Now that you have a better understanding of what data fabric is, let’s consider the problems it solves and why it may be right for your organization.

Access your data faster

When your data is in multiple formats and housed in a variety of locations, gaining access to the specific details you need can take hours, days, or even weeks, depending on your architecture. A logical data fabric leverages metadata, semantics, and machine learning to quickly return the needed data from across multiple sources, whether it’s a large amount of historic information or highly specific data used to drill down into a report.

Democratize your data

Data fabric uses advanced semantics so that data is accessible in the language of business users, such as BI and analytics teams.
Data consumers within the organization can access what they need without going through data engineers or the IT department, eliminating bottlenecks and sharing ownership of data.

Improve governance

Because of the automation capabilities of data fabric, you can implement a governance layer within the fabric. This applies global policies and regulations to data while allowing local metadata management to reduce risk and ensure compliance.

What is data mesh?

Monolithic data architecture keeps data in one centralized location. On paper, this seems like a more cost-effective, efficient option compared to a distributed architecture, but it still brings several challenges. Consider that in many large organizations relying on a monolithic architecture, massive volumes of unstructured data are stored in a data lake. For information to get into the hands of data consumers, or before productization can occur, the data must be accessed and processed through the IT department, creating significant bottlenecks and bringing time to market to a crawl.

A data mesh can solve this challenge. It is a new type of data architecture, proposed only in 2019 by Zhamak Dehghani of Thoughtworks, in which a framework shifts data from a monolithic architecture to a decentralized one. More specifically, the data is distributed across autonomous business domains, where the data consumers own, manage, and share their own data as they see fit. While the domains are given a separate virtual schema and server so they can have full ownership over data productization, governance, security, and compliance remain centralized.

Benefits of data mesh

The challenges of centralized data ownership include latency; the added costs of storage, software, and replication; and a lack of practical access for consumers. Implementing a data mesh can solve these.

Eliminate IT bottlenecks

When all data is forced to go through the IT department before being distributed to the individuals or teams requesting it, bottlenecks occur and slow the flow of data. A data mesh allows data to bypass the IT department and flow freely to those who need it.

Improve flexibility and agility

Finding specific information within the massive volume of unstructured, undefined data stored in a data lake requires increasingly complicated queries. A data mesh instead gives ownership of datasets to individual teams or business owners, simplifying access and offering real-time results through scalable, automated analytics.

Increase connection to data

By transferring data ownership to the data consumers, those who use the data directly have a greater connection to it. The data is available in the language of business, and it can be shared across teams with greater ease and transparency. The sketch below illustrates the domain-ownership idea.
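To make that idea a little more tangible, here is a toy Python sketch of a mesh-style “data product”: each business domain publishes and serves its own data behind a small interface, while a shared access rule stands in for centralized governance. All names and rules are invented for illustration.

```python
# Toy data-mesh sketch: domains own and serve their own data products;
# one shared access check stands in for centralized governance.
from dataclasses import dataclass

APPROVED_CONSUMERS = {"analytics_team"}  # centralized governance rule


@dataclass
class DataProduct:
    domain: str      # the owning business domain
    name: str
    records: list

    def serve(self, consumer: str) -> list:
        if consumer not in APPROVED_CONSUMERS:
            raise PermissionError(f"{consumer} lacks access to {self.name}")
        return self.records


# Each domain publishes its own product; no central IT queue in between.
marketing = DataProduct("marketing", "campaign_results",
                        [{"campaign": "spring", "ctr": 0.042}])
finance = DataProduct("finance", "monthly_spend",
                      [{"month": "2023-01", "spend": 18000}])

print(marketing.serve("analytics_team"))
print(finance.serve("analytics_team"))
```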
Choosing data fabric vs data mesh

Data fabric and data mesh both support data democratization, improve access, eliminate bottlenecks, and simplify governance. While data fabric is built on a technology-agnostic framework to connect data across multiple sources, data mesh is an API-driven organizational framework that puts data ownership back in the hands of specific domains. So, which is better in the debate between data fabric and data mesh? The simple answer is that neither one is better than the other, and the right option is determined by the use case.

If the goal of your organization is to streamline data and metadata to improve connection and get real-time results across multiple teams, a data fabric built on a data virtualization platform can help you meet your goals. On the other hand, if you need to improve the process of data productization and decentralize your data, a data mesh may be the best option. But the real answer is that, contrary to popular belief, the two are not mutually exclusive, and most businesses succeed by implementing both options. Data fabric and data mesh are complementary solutions that can work together to solve the challenges of your existing architecture.

Learn more about data fabric and data mesh

Want to gain further insight into choosing data fabric or data mesh? We partnered with data management leader Denodo Technologies for a recorded webinar. In Logical Data Fabric vs Data Mesh: Does It Matter?, we provide an in-depth look at monolithic and distributed data architectures, the challenges they bring, and how both data fabric and data mesh can improve agility, reduce costs, and elevate the quality of your data. To ask additional questions or learn how Fusion Alliance can help you create and implement a successful data strategy to meet your unique challenges and goals, connect with our team today.

Learn more about modern data platforms >>
Modernizing technology (without threatening civilization)
Robots are (probably) not the answer

If we’ve learned anything from films like 2001: A Space Odyssey, Terminator 2: Judgment Day, and Avengers: Age of Ultron, it’s that we need to keep vigilant watch for robot uprisings. But if we’ve learned anything from the past three years, it’s that we also need to be vigilant about keeping our technology future-ready. Legacy systems are all well and good when the world is operating smoothly. But “smoothly” isn’t how the pace of business rolls these days. So, how do you shore up your IT architecture while keeping costs low, maintaining efficiency, and keeping operations on an even keel (even when the world is not)?

🤖 Option 1: Build a team of robots to handle your business processes, hoping for more WALL-E and less T-1000.

OR

👨‍💻 Option 2: Leverage flexible, adaptable technologies to optimize workflows, modernize legacy systems, and give your business the agility to meet change head-on.

📖 Pro Tip: We recommend option 2 for efficiency, cost-effectiveness, and preserving humanity from a robot uprising.

Want to learn more about emerging technology trends? We can help you chart the right path forward. Want to learn more about robot uprisings? Both Terminator 2 and 2001: A Space Odyssey are streaming on HBO Max!
Annual planning is a bunch of hocus pocus
Working on your marketing plan? Time to form a calming circle

Don’t get spooked, but you’ve only got two more months to pull your 2023 marketing plan together. And if that dire and stressful thought has you feeling like the icy breath of death is on your neck (just us?), you could:

- Put on a scarf
- Hope somebody on your team has an epiphany about personalization
- Get your customer data strategy back on track

Here’s your treat: you can tackle customer data without resorting to potions. But you do need a recipe. In our Customer Data Strategy Workshops, we bring marketing, data, and technology stakeholders together to map your data and processes, quantify your risk, and map out a modernization plan.

Fun-size: Not sure how to dive in? We hosted a digital roundtable event in December where marketing leaders from multiple industries talked over real-world issues like CDP vs MDM, handling data security in regulated spaces, and how broader 2023 trends impact your martech and C360 strategies. Don’t worry, we have the recording in case you missed it!

Check out the full video here >
Is your data cereal or soup?
The importance of data classification

Often presented as a click-bait internet poll, the question “Is cereal a soup?” is only baffling until you realize that the answer hinges on how you define the term. Merriam-Webster contends that soup is a liquid sustenance often containing pieces of solid food. Therefore, as one respondent said, cereal is a soup “technically, though not existentially.”

Proper definition of terms is also critical when it comes to classifying your data. To get the most from your data assets, you’ll need a strong data strategy, supported by definitions like:

- How information is grouped, weighted, and prioritized
- How common dimensions will be conformed
- How data will be standardized, cleansed, and tagged

Your data use cases, sources, and architecture are unique. How you define your data strategy should be, too. Fusion’s team of data, technology, and digital experts can help you architect and implement a comprehensive data strategy, offer insights and best practices to support a growing data culture, or step in to solve a particular problem. Don’t let data eat your business for breakfast.

Learn more about defining your data terms or get in touch for a quick consultation.
Accelerating time to value by implementing logical data fabric
Today’s businesses collect more data than ever before, but many don’t have the architecture in place to store, process, and recall the data in real time. Whether an enterprise-level organization stores all its data in a single data lake or relies on multiple, disparate sources, either approach can cause significant delays in finding the specific information you’re looking for. Traditionally, if your organization wanted to update and upgrade its existing architecture, the only option was to extract, transform, and load (ETL) the data into a new framework. Implementing a logical data fabric offers a better alternative — giving companies a cost-effective, efficient way to collect and integrate data while building a stronger framework across the organization. At a recent CDO Data Summit, Mark Johnson, Fusion Alliance Executive Vice President and editorial board chair for CDO magazine, sat down with thought leaders in the data industry to discuss why logical data fabric is essential in accelerating time to value.
What is a logical data fabric?
When you have multiple disparate data sources, a data fabric acts like a net cast over the top, pulling individual information sets together in an end-to-end solution. Data fabric is a technology-driven framework that lies within the existing architecture, unlike a data mesh, which is a methodology regarding how data should be distributed among data owners and consumers. In a logical data fabric, multiple technologies are implemented to catalog and organize existing data and integrate new data into the fabric. Data virtualization is the central technology deployed within this framework, creating an abstracted layer of unified data that is more secure and easily accessible.
What challenges are solved by a data fabric architecture?
Logical data fabric architecture offers a solution to the challenges faced by organizations that rely on numerous data storage solutions or repositories of structured and unstructured data:
Overcome slow data delivery
By consolidating data into an integrated semantic layer, common business applications can process, analyze, and return the data in real time, in the language of the data consumer. This improves accessibility and significantly reduces the latency that comes from applications having to search across multiple sources to return information.
Simplify governance
If every data warehouse, database, and cloud-based platform within your organization relies on separate governance, you are dealing with significant inconsistencies. By stitching the data together in a logical data fabric, centralized governance can be applied across all data and automated to maintain and streamline the process.
Reduce IT bottlenecks
Data fabric automates how data is processed, integrated, governed, and utilized, enabling real-time analytics and reporting. This puts data in the hands of your BI and analytics teams more quickly while removing bottlenecks from your IT department.
With a logical data fabric architecture, your business can respond to trends and changes within your industry more quickly, helping you evolve both short- and long-term strategies to reflect what your data is telling you in real time. Is a logical data fabric the right solution for your organization? Learn more about data fabric architecture from the CDO Data Summit’s round table discussion.
Mark Johnson is joined by:
Baz Khauti, President at Modak USA
Richie Bachala, Principal, Data Engineering at Yugabyte
Ravi Shankar, SVP and Chief Marketing Officer at Denodo
Saj Patel, VP of Data Solutions at Fusion Alliance
This panel addresses critical questions about data in today’s business to help you solve your unique data challenges, including:
Is the fabric of data virtual, physical, or both?
How do we get value out of our data?
Do we take a connect or collect approach?
How comprehensive do we need our data approach to be?
Are we optimizing for agility or for flexibility?
How do we deliver unified data?
Is the organization aligned on what it wants from its data?
What AI/ML techniques do we want to employ, if any?
If you have specific questions or are ready to take the next step and learn how we can help you create custom data solutions for your organization, reach out to us today for a quick chat! Learn more about modern data platforms >>
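For the technically curious, here’s a toy Python sketch of the virtualization idea at the heart of a logical data fabric: two disparate “systems” stay exactly where they are, while a thin unified layer answers questions on their behalf. Real platforms do this declaratively and at enterprise scale; the sources and fields below are invented for illustration:

```python
# A toy illustration of the "abstracted layer of unified data" behind data
# virtualization. The two in-memory databases stand in for disparate systems.
import sqlite3

crm = sqlite3.connect(":memory:")       # disparate source 1: a CRM
crm.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
crm.execute("INSERT INTO customers VALUES (1, 'Acme Corp')")

billing = sqlite3.connect(":memory:")   # disparate source 2: a billing system
billing.execute("CREATE TABLE invoices (customer_id INTEGER, amount REAL)")
billing.execute("INSERT INTO invoices VALUES (1, 1250.0)")

def customer_360(customer_id: int) -> dict:
    """The 'fabric' layer: callers get one unified answer and never need to
    know which underlying system holds which piece of the data."""
    name = crm.execute(
        "SELECT name FROM customers WHERE id = ?", (customer_id,)
    ).fetchone()[0]
    total = billing.execute(
        "SELECT SUM(amount) FROM invoices WHERE customer_id = ?", (customer_id,)
    ).fetchone()[0]
    return {"id": customer_id, "name": name, "lifetime_billing": total}

print(customer_360(1))  # {'id': 1, 'name': 'Acme Corp', 'lifetime_billing': 1250.0}
```

Swap the toy function for a virtualization platform and the shape of the win is the same: consumers query one layer, and the plumbing stays out of sight.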
Help vs hype: branded communities
Unpacking branded communities
At its Inbound 2022 conference, HubSpot unveiled a new community of practice, connect.com, where HubSpot marketers can share ideas and build connections. In recent years, many companies have made forays into community building, with mixed results. An online community can help companies build stronger customer relationships, discover important product and consumer insights, and drive sales. Or it can be a waste of time. If you want an online community that drives business value, you’re going to need a strategy. Real people don’t bond around yet another platform, app, or notification. To drive high-impact results, successful community strategies typically include:
Audience awareness: If you haven’t already, this is the time to do persona research and carefully assess your customer journeys. “Community” means different things to different people — and in different contexts. Knowing your target audience can help you craft experiences that resonate.
Authentic value: Is your audience looking for ideas, collaboration, training, or access to learn from experts? Where does their need intersect with your organization’s personality and expertise? For your brand’s online community to thrive, it needs to fit your culture and deliver real value to your target members.
Aligned platform: Whether it’s a Facebook group, Slack channel, or fully branded platform experience, the technology that hosts your community matters. The platform experience impacts engagement and retention, and ultimately determines how much tangible ROI you’ll see in the form of insights, feedback, and pipeline.
Interested in exploring what an online community could do for your brand? We can help. From customer research to complete community engagement strategies, Fusion partners with mid-market and enterprise companies to reimagine customer connection. Learn more >>
GA4 transition meltdowns, and how to deal with them
For many, the approach of the Google Analytics 4 takeover brings nothing but dread. And we get it. Although GA4’s new insights and methodology can be an incredible boon to businesses that want to improve their user targeting, it’s a huge adjustment, and there’s no simple solution. Below, we cover the most commonly voiced pain points of a GA4 transition and offer ideas on how to adapt, so you can keep your team from burning their laptops and heading for the hills.
I can’t export my data from UA into GA4.
Transitioning from Universal Analytics to Google Analytics 4 isn’t a simple, one-to-one migration. Why? Because GA4 operates on an entirely new user-based model. UA uses a session-based model that collects data from actions taken by the user within a particular session. GA4, on the other hand, relies on an events-based approach: instead of looking at goals per session, you’re looking at events per user. This allows you to track user behavior across multiple devices and platforms. While this means better insights into your target audience, it also means migration isn’t as easy as importing your UA data into GA4. You’ll need to audit the metrics you’re currently using and rethink them in light of GA4’s new data tracking methods. Get tips for GA4 migration without the migraine >>
My views are no longer available.
If you’ve spent any time in Google Analytics 4, this loss may be the most jarring discovery you’ve made. Raw, test, or official — all the most commonly used views are gone. Before you have a panic attack, breathe into a paper bag and read on. GA4 doesn’t use views because the interface is set up for you to create custom data filters. For those of us who aren’t data experts, this can seem daunting at first, but it gives you the opportunity to track the information that’s going to be most valuable to your company. Here are three things you should know about data filters:
The filters you create are permanent. This means you should be certain of what you’re tracking before you set up a filter. Use “test mode” to preview your filtered data before you set up the filter for good.
You can set up to 10 filters. You likely won’t have to worry about maxing out your filters, but again, do some internal testing to prioritize the filters you need so you’re not taking up real estate with a data view that’s not going to provide long-term insights.
You can filter out internal IP addresses. No more worrying about throwing off metrics every time you test a lead form! You can set data filters to exclude your team’s activity so you’re only seeing what’s happening from external visitors.
Learn more about building custom reports in GA4 >>
Bounce rate is changing.
Bounce rate is one of the elite metrics that many look to as a gauge for page effectiveness. But the truth is, bounce rate may not be as helpful as we’ve believed it to be. The GA4 transition doesn’t quite do away with bounce rate, but you’ll find the metric in a different form. Instead of a 1-for-1 replacement of UA’s bounce rate, GA4 provides a different way to measure page effectiveness, in the form of “engaged sessions.” So, instead of tracking who comes to your site and doesn’t interact, it measures those who come to your site and do interact, and monitors which actions they take. Thus, you can calculate bounce rate as the inverse of engagement rate.
However, bounce rate is much less insightful than engagement rate, so although the drastic change may be awkward at first, we’re confident that most users will warm up to this method and even find it more beneficial in the long run.
The GA4 user interface is difficult to use.
For users who’ve spent the past 15 years using UA’s out-of-the-box reporting, GA4 can be an upsetting experience, to say the least. UA’s pre-defined reports provide valuable data, but GA4 has a serious leg up over its elder counterpart: custom reporting. You can recreate most of UA’s reporting by creating custom reports with GA4’s Exploration tool. This means you’re not getting less functionality — you’re getting more ways to view your data, and you can tailor them to your business and desired outcomes. Will that make the interface any less weird or uncomfortable? Sadly, no. But with some training and a little practice, users will acclimate to the change and begin to see the benefits of custom reporting and the far more accurate data it provides.
Adapting to the wide world of GA4
Although GA4 will take some getting used to, the new model gives a simultaneously holistic and granular view of user behavior that offers better insight into your target audience — and how to reach them. If you’re ready to take the plunge into GA4 but still apprehensive, our Google Analytics 4 Survival Guide can give you the tools and tips you need to start out on the right foot. Or, if you want a real person to talk you through the process and help you get set up, talk to a Fusion Alliance analytics expert today. Get the GA4 Survival Guide >>
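If you want to sanity-check the bounce-rate-as-inverse-of-engagement math described above, it’s simple enough to express in a few lines of Python (the session counts below are made up):

```python
# GA4 counts a session as "engaged" if it lasts 10+ seconds, includes a
# conversion event, or includes 2+ page views. Bounce rate is what's left over.
def bounce_rate(sessions: int, engaged_sessions: int) -> float:
    """Bounce rate as the inverse of engagement rate, per GA4's model."""
    engagement_rate = engaged_sessions / sessions
    return 1.0 - engagement_rate

print(f"{bounce_rate(sessions=2000, engaged_sessions=1480):.1%}")  # 26.0%
```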
Building GA4 custom reports (you’re going to need them)
In the golden era of Universal Analytics (UA), Google pre-packaged a comforting array of reporting right out of the box. But, as companies transition to Google Analytics 4 (GA4) in preparation for UA’s planned sunset in 2023, marketers have been surprised to find far fewer of those pre-set analysis tools, and many are scrambling to rebuild the reports they rely on for key metrics. For example, if you check your UA account for acquisition, you’ll find roughly 25 different reports you can tap into right away. If you check acquisition in Google Analytics 4, on the other hand, you’ll see an overview screen and…two reports. But there’s no need to panic. While switching from UA means you give up those pre-packaged reports, what you gain from GA4 is the opportunity to collect data and analyze it in ways that make the most sense for your business. In this article, we’ll point you to places where you can find the UA reports you’re used to in GA4, and then we’ll show you how to build GA4 custom reports that fit your business needs. Forget the good old days. The best is yet to come.
Find your favorite UA reports in GA4
While you might not be able to find a one-to-one match for everything you’re used to using in UA, GA4 does offer some reasonable facsimiles, although the naming may be different.
Acquisition Reporting → Traffic Acquisition
If you use UA’s Acquisition Reporting to answer questions about website traffic, you can find some similar metrics in GA4’s Traffic Acquisition. You’ll notice that Traffic Acquisition is set up in a similar format, but — and this is a big hurdle — you won’t be able to drill down into the data with a few quick clicks in GA4 like you can in UA. In GA4, instead of clicking around to find information, you use the plus sign (+) to set up secondary dimensions when you want to drill down into information. As you set up secondary dimensions, you’ll be able to search and narrow down the data to determine the best way to answer the questions your business is asking. In this case, GA4 shows you the same information you found in UA, but in a more targeted, deliberate format.
Bounce Rate → Engagement Rate
At first, it seemed bounce rate had bounced out of the analytics arsenal entirely with GA4, but now it seems the metric most responsible for marketing panic attacks is back, in a slightly different form. Bounce rate does exist in GA4, but it’s calculated a bit differently because of GA4’s different data model. So, if you compare your current UA bounce rate to GA4, you will see a difference, and you’ll need to set new benchmarks. GA4 introduced a new metric to try to give us better information about how visitors use our websites: engagement rate. Unlike UA’s bounce rate, GA4’s engagement rate measures people who stay on your site and actually stay engaged. It’s a bit more dimensional than bounce rate, but also a little more difficult to manipulate. You can export engagement rate data into Excel if you’re up for doing a little more digging, but this report is one that might benefit from customization.
Audience Overview → Demographics Overview
Similar name, same functionality! As in UA, in GA4, Demographics Overview gives you a quick snapshot of your users, including:
New vs returning users
Demographic data
Browser and operating system access
Content Drill-Down → Pages & Screens Report
In UA, the Content Drill-Down report gives a view of how site content performs at a hierarchical level within the URL structure.
In GA4’s Pages & Screens Report, on the other hand, you see your content by page title, but not by section. You can change the GA4 report view to page path, which allows a little more clarity, but the interface doesn’t support clicking around into different sections and paths. A few workarounds may help:
Use the search function to look up different sections of your website, like “blog,” “about,” “services,” and so forth
Export content to Excel to group and compare different sections against each other
Use Explorations rather than Pages & Screens to dig into specific content performance questions
Explorations
When you can’t find a 1:1 match for a UA report you used to rely on, you could use GA4’s Explorations function to rebuild an exact match, but you could also take the opportunity to fine-tune the report to answer questions in an even better way. Within the GA4 Explore tab, users can build their own detailed reports, called Explorations, from a gallery of templates. We expect that this library will continue to grow, but the baseline options are already quite useful. Of course, as you build out your own custom GA4 reports, you’ll want to start from a list of defined questions that serve your own internal goals, KPIs, and requirements. But to help you get the hang of how to create your own Explorations, we’ll outline a few examples here, based on UA reports you may be used to using.
Behavior Flow → Path Exploration
This report delivers a segmented view of website traffic. For example, to find out how many site visitors get to your contact page via organic search, you can create a path exploration by:
Navigating to the GA4 Explore tab
Choosing the path exploration template
Clicking the organic search sector
Clicking the path to find how many of the visitors in this segment visited the contact page
Behavior Flow → Funnel Report
You can create a similar version of the path exploration report with a funnel report, adding more detail about the steps you want to analyze. You can set this report up by:
Navigating to the GA4 Explore tab
Choosing the funnel exploration template
Clicking the organic search sector
Clicking the path to find how many of the visitors in this segment visited the contact page
Adding form submission as a requirement
Editing and adding steps to change the desired action
This report can give you a better idea of common visitor behavior flows on your website. It’s a highly customizable GA4 report, both in its steps and in the dimensions of behavior you can track across the site.
Exit Pages → Free Form Explorations
Although you can’t find exit pages and exit page percentages out of the box in GA4 as you can in UA, you can use free-form explorations to create a custom GA4 report that gets you that data. Here’s the setup process at a high level:
Navigate to the GA4 Explore tab
Select the Free Form Exploration template
Set up the page path you’re tracking, including exits and sessions
Compare which pages have the most exits to total sessions to get the percentage
While you can’t find the percentage in a calculated column, this report is a helpful replacement if you need to find this data quickly. Using free-form explorations, you can take any of your metrics and add any dimensions to home in on your data at a very close level.
More options for building GA4 custom reports
As you dig into GA4, you may find that you can add new dimensions and metrics to the tables for some of GA4’s limited out-of-the-box reports, but you may find that the results lack the depth you need.
Depending on the types of reporting you need, you may also find that Explorations give you enough common views to replace most of what you find in UA. However, most marketers will need to go a step further in their GA4 reports before completely moving away from UA. With GA4 still in flux and new features and functionality shifting, Looker Studio (formerly Google Data Studio) may offer your team a way to find consistency and recreate some of the views you were used to in UA with your GA4 data.
Shifting from UA outputs to GA4 custom reports isn’t easy
To make the switch from UA to GA4 seamless, you might need to call in reinforcements. Check out our GA4 resources or let us know if you have a specific question. Our team is helping mid-size businesses and enterprise-level organizations handle every aspect of the GA4 transition, and we’re happy to help you get what you need to be successful.
[Match-up] Understand the differences between Universal Analytics and Google Analytics 4
[Path] What you need to know to upgrade to GA4
[Plan] 6 steps to make your Google Analytics 4 transition easier
[Video] Meet the new Google Analytics 4
[Contact] Ask us anything
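If your team prefers scripting reports to clicking through the interface, here’s a minimal Python sketch using Google’s google-analytics-data client library for the GA4 Data API. The property ID is a placeholder, you’d need service-account credentials configured, and the library’s surface may shift as GA4 evolves:

```python
# pip install google-analytics-data
# A minimal sketch of a scripted GA4 "custom report": engagement by page path.
# Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key with
# access to the property. Dimension/metric names follow the GA4 API schema.
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange, Dimension, Metric, RunReportRequest,
)

def page_engagement_report(property_id: str = "123456789") -> None:
    client = BetaAnalyticsDataClient()
    request = RunReportRequest(
        property=f"properties/{property_id}",
        dimensions=[Dimension(name="pagePath")],
        metrics=[Metric(name="sessions"), Metric(name="engagementRate")],
        date_ranges=[DateRange(start_date="28daysAgo", end_date="yesterday")],
    )
    for row in client.run_report(request).rows:
        path = row.dimension_values[0].value
        sessions = row.metric_values[0].value
        engagement = row.metric_values[1].value
        print(f"{path}: {sessions} sessions, engagement rate {engagement}")

page_engagement_report()
```

The upside of the API route: once a question is encoded this way, it’s repeatable, versionable, and easy to feed into Looker Studio or a spreadsheet.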
Time to retire your tech debt
Your grandpa wants his solution back
Reality television is nearly 75 years old. Shortly after World War II ended, your great-grandparents may have watched Candid Camera on ABC. Nowadays, Survivor is on its 43rd season, and the genre continues to crank up the cringe factor with increasingly dubious show premises in hopes that they’ll stumble into a hit franchise. Technology can be like that. Organizations invest in ground-breaking technology solutions, then add patches, fixes, and enhancements over time. In the short term, companies can achieve significant process and resource savings by building onto an existing solution. But in the long run, legacy technologies struggle to keep competitive pace, the enterprise architecture takes on a life of its own, and the organization starts missing opportunities. This is tech debt — the bane of IT departments at just about every mid-market to enterprise business. Gartner recently found that even in the face of “irrecoverable failure” of legacy solutions, many executives favor compounding technical debt by continuing to build on end-of-life systems, rather than investing in a new solution that better fits the needs of today’s business. It’s understandable. Starting over from scratch is a huge investment, with a large potential downside if it doesn’t work. Executives facing the high probability of a recession are more likely to hedge their bets. The answer, once again, is composability. At our most recent APIs Over IPAs panel discussion, IT executives from NetJets and OCLC talked over a variety of topics, including how to overcome tech debt by making the business more composable. If you couldn’t make the event, check out these short video clips:
How composability can reduce tech debt
How to increase API adoption (without resorting to buzzword strategy)
How to decide between a COTS or custom API gateway tool
Your end-of-life technology may not be quite as old as reality TV, but there’s no need to trail along with a has-been solution. Fusion’s API experts partner with technology leaders from mid-market and enterprise brands to talk about the latest trends in modular applications. Find out how to get blockbuster results (without scary budgets) through composability. Register for our next APIs Over IPAs event
Customer IQ: The new imperative
Data has always been integral to organizations. But as customer expectations continue to evolve, data-driven insights have proven essential to optimizing customer relationships. As a result, data leaders are not only central to transformation; they are often leading it. From finding, acquiring, serving, and retaining customers to predicting and delivering the customer moments that matter, data and analytics have emerged as essential enterprise competencies. At this year’s MIT CDOIQ Symposium, our Regional Vice President and Executive Data Leader Mark Johnson sat down with a panel of experts, including Todd James (Chief Data and Technology Officer at 84.51), Chris Tambos (VP of Data & Analytics at Fortune Brands Water Innovations), Eric Wiegand (Industry Expert), David Levine (VP of Solution Sales at Fusion Alliance), and Saj Patel (VP of Data Solutions at Fusion Alliance). Together, these industry leaders illuminate the outcomes and successes your business can see at the intersection of digital, data, and analytics, and they present emerging best practices to support your success. As business models pivot to meet the ever-evolving needs of customers and organizations, these experts explain how their organizations answered some of their biggest challenges, including using data and analytics to navigate pandemic-related changes. This panel also covers how data fits into your bigger business strategy. Marketing has traditionally been the biggest consumer of customer data, but now we are seeing how important data is to all areas of the organization. With a wealth of knowledge among these great data minds, this panel provides information you won’t want to miss, with insights that can help you make the right choices and provide the ultimate customer experience.
“API-First” is a buzzword, not a plan
This panel was moderated by John Dages, Technology Solution Director at Fusion Alliance (left), and features Ryan Shondell, Executive Director of Data Services at OCLC (center), and Jeremy King, Chief Enterprise Architect at NetJets (right). Good technology executives know that good directives must be followed up with action plans that create value. So how do you get the business on board with an API-first strategy? How do you enhance API adoption at the enterprise level?
Solve an actual problem. Identify a genuine, felt need within a service or function in your span of control, and use microservices, headless, cloud-native, or composable functions to handle it. Buzzwords don’t drive API adoption. Follow-through does.
Show the business how an API-first strategy makes their job easier. The average business user doesn’t care about fancy architecture. They don’t care if you use microservices, make something headless, or are running a composable enterprise unless it makes a difference in their day-to-day. They want the website to be faster or to have more capabilities so they can drive their own KPIs. So how do you drive API adoption when your business users are thoroughly tech-agnostic? You build change management into your roadmap and win hearts and minds with results. When it comes to getting the business on board with API-first initiatives, the only way to build trust is demonstrable progress.
About our panelists:
Ryan Shondell is currently the Executive Director of Data Services at OCLC, responsible for developing and executing the company’s data strategy and aligned technology. This includes technical product management, data operations, data quality, and development of AI/ML capabilities, analytics, search, and all customer-facing data applications and APIs across a staff of 300. Prior to joining OCLC, Ryan held multiple senior engineering leadership positions at VMware going back to 2010, most recently as Senior Director of Engineering, where he helped to lead global development on products like Skyline and VMware Cloud. And now, he’s actually headed to Path Robotics to start his next adventure.
Jeremy King has been working in technology for over 20 years and is currently the Chief Enterprise Architect at NetJets. He started his career designing and developing embedded systems and has worked in many industries, including banking, health care, travel and transportation, and integration tools. His background includes distributed cloud-native architecture, data structures and modeling, enterprise integration patterns, event-driven architectures, and API design. As a software architect, Jeremy has faced the challenge of making disparate systems exchange data in consistent, performant ways. His current passions include technical innovation, graph databases, and emerging API standards.
API gateway tools: build or buy?
This panel was moderated by John Dages, Technology Solution Director at Fusion Alliance (left), and features Ryan Shondell, Executive Director of Data Services at OCLC (center), and Jeremy King, Chief Enterprise Architect at NetJets (right). The decision to build or buy API gateway tools for your organization is rarely black and white. Building an API gateway pulls your team away from other opportunities, but buying could lead to vendor lock-in. Off-the-shelf solutions might not give you a differentiating advantage, but inventing your own protocols could accelerate your tech debt. How do enterprise businesses weigh the trade-offs?
When to consider building an API gateway tool
The custom tool will deliver differentiated value directly to customers or enable the business to deliver that value
The custom tool solves a consumer or business problem in a way that delivers market value
The custom tool gives you a competitive edge in functionality, cost, or speed to market
When to consider buying an API gateway tool
Your IT time and talent are needed on other revenue-driving projects
Your team could use an out-of-the-box tool as a platform and build your custom functionality on top of it as an accelerator
The tool delivers significant ongoing maintenance and support savings
Know which shark is closest to the boat
It’s not always simple to project future savings or quantify possibility. If you’re faced with a build-or-buy tech decision, sometimes you have to solve for the biggest issue at hand. That could mean you buy an API gateway tool or another off-the-shelf solution. Yes, it’s a vendor dependency, but every other part of your business has them, too. If you make a forward-looking choice and keep your eyes open, you can avoid many of the pitfalls associated with vendor lock-in. For example, finding components that are portable, avoiding proprietary pieces, and limiting the specialized components you buy outright can all help. And it’s always wise to create a backup plan. As they say, “don’t have such an open architecture that your business falls out.”
Is your composable enterprise project working or causing tech debt?
This panel was moderated by John Dages, Technology Solution Director at Fusion Alliance (left), and features Ryan Shondell, Executive Director of Data Services at OCLC (center), and Jeremy King, Chief Enterprise Architect at NetJets (right). How many meetings do you have with Amazon when you want to use S3 to move a workflow into the cloud? None, of course. Amazon makes its S3 service easy to find, understand, and consume. And how many meetings do you have to have when you add an API to your own organization’s technology ecosystem? The “how many meetings” test isn’t a trick question. It’s a good rule-of-thumb metric for judging whether your service is driving value or contributing to tech debt. Bottom line up front: APIs must be consumable to add value. If you’re building an API and you have to have a meeting before someone can use it, something has gone wrong, and you may need to reconsider your composability approach. If you build an entire composable ecosystem that works the same way the world worked 20 years ago, you’re cruising for tech debt rather than ROI. A functional composable enterprise requires components that are discoverable, with documented constraints and functionality, so that operators can find what they need and put it into place without a lot of hand-holding. This discoverability could come from traditional documentation or from effective use of introspection endpoints that allow for programmatic, systemic discoverability. Either way, to reduce your tech debt and boost the ROI of your composability, be sure to ask yourself: how many meetings did we have before we could offer or consume this product? The closer that number is to zero, the better.
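To make the introspection-endpoint idea concrete, here’s a minimal Python sketch using FastAPI, which publishes a machine-readable OpenAPI schema automatically. The service and endpoint names are invented, and this is one of several frameworks that work this way, not a prescription:

```python
# pip install fastapi uvicorn
# A minimal sketch of a "discoverable" component: FastAPI generates an OpenAPI
# schema at /openapi.json and interactive docs at /docs with zero extra work.
from fastapi import FastAPI

app = FastAPI(
    title="Orders API",
    version="1.0.0",
    description="Self-describing service: fetch /openapi.json to discover it.",
)

@app.get("/orders/{order_id}")
def get_order(order_id: int) -> dict:
    """Constraints (path shape, integer typing) flow into the published schema."""
    return {"order_id": order_id, "status": "shipped"}

# Run with: uvicorn orders:app --reload
# Consumers can then read http://localhost:8000/openapi.json programmatically
# and wire themselves up -- no meeting required.
```

That published schema is the whole point: documentation that machines (and other teams) can read gets your meeting count closer to zero.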
Data and the digital dragon
Forge your strategy in Valyrian steel
Brilliantly harnessing the news cycle, Gartner released its report on how to become a digital dragon just as HBO premiered its Game of Thrones spinoff series, which details the risks and opportunities of becoming a dragon in the geopolitical sense. Not to spoil the plot of either narrative, but dragons (of the digital or CGI variety) are not simple to create, tame, or control. That’s where data comes in. In the developing digital ecosystem, a solid data strategy harnesses that dragon energy and keeps the organization ready to pivot at the right time. Here’s how Gartner’s three steps to digital dragon status play out for data:
Increase risk appetite: To be adaptive, spot potential immediately, and act fast, you need to be able to trust your data.
Inject cognitive diversity: To keep your diverse and flexible talent pool from getting trapped in information silos, you need to be able to deliver your data.
Incubate digital dragons: To cultivate a curious, entrepreneurial culture in your organization, your executives and business units need to be able to leverage your data.
No matter where your organization is on the path to becoming a digital dragon, data’s role is to drive better decisions, better customer experiences, and better access to information across the business. Data strategy is a key component of any successful digital transformation. Forge your data strategy with Valyrian steel (metaphorically speaking). Let’s talk about where you are on your journey, and how we can help you get there faster.
Get Smart: If the summary article linked above left you wanting more, the (gated) Gartner report you’re looking for is: “The Armed and Potent Digital Leader: How to Become a Digital Dragon.” For those without the stomach for another foray into Westeros, but who still have a taste for dynastic feuding and magical flying (of a sort), you might check out Alex Bledsoe’s Appalachian fairy tale The Hum and the Shiver.
Cage Match: Google Analytics 4 vs Universal Analytics
Marketers at mid-market companies are going through the stages of grief over Google Analytics 4. Some are still in denial (“Whatever, we have Universal Analytics until July 2023, and they will probably extend the deadline.”) and others have moved on to anger (“Google Analytics sucks!”). If you’re actively wrestling with the transition, you’re probably squarely in that cage match stage. We get it. Our team has been helping businesses with GA4 implementations since it first rolled out, and making the switch isn’t easy. Along the way, we’ve found four key matchups that might help you avoid getting KO’d by your Google Analytics 4 transition.
Google Analytics 4 vs Universal Analytics
Round 1, Tracking: GA4 tracks users and events; UA tracks sessions and goals.
Round 2, Data: GA4 measures engagement; UA measures bounce rate.
Round 3, Interface: GA4 offers low clickability, with drill-down analysis through secondary dimensions; UA offers high clickability, with drill-down analysis through ad hoc clicking in results.
Round 4, Reports: GA4 provides Explorations; UA provides embedded reports.
Round 1: Tracking
In Universal Analytics, we track metrics by sessions tied to behavioral goals. Here’s one common scenario: Patricia visits your website home page on her phone one day and bounces. A few days later, she scrolls an article from your site on her desktop and bounces. The following week, she reads a case study on her iPad while waiting in the carpool pickup line and fills out a contact form. At this point, UA documents three sessions and one goal achieved. But is that the best way to document what actually happened here? Google thought not, and pivoted. Google Analytics 4 tracks metrics according to users and events. Using the same scenario, GA4 documents that one user took four actions on your website: home page, article, case study, form fill. You see the progression, and the impact your site has, at a more granular and yet more holistic level. With GA4’s user-based data model, you see what users do when they interface with your brand — regardless of the device they use, the location they come from, or the platform they visit. Instead of goals per session, you’re now able to see events per user.
Round 1 winner: GA4
Round 2: Data
When July 2023 rolls around, a lot of people are going to be caught by surprise when they can’t just upload their Universal Analytics data directly into Google Analytics 4. Next, they will try to export their data from UA to GA4, and that will…also not be possible. Sorry, folks, it simply will not work. The data models are different, and Google has not laid down a path to make a simple data port happen. How can you counter that punch? Your best move is to get started in GA4 as soon as possible, so you can begin to collect new data and understand how those metrics will influence your overall reporting. And the sooner you start collecting GA4 data, the sooner you’ll be able to use it in your year-over-year metrics. Because the data points aren’t the same, your old UA data won’t be of much use in year-over-year comparisons. Learn more about how switching to GA4 impacts YOY reporting >>
The shift that impacts almost everyone changing over from UA to GA4 is the shift in bounce rate. Long used as a benchmark for user engagement, UA’s bounce rate metric didn’t make it to GA4 in quite the same format. Instead, GA4 measures engagement in more absolute terms. GA4’s new “engagement” metric takes the guesswork out of bounce rate.
Instead of saying “people are leaving immediately, probably because the content wasn’t relevant or the load speed lagged,” GA4 tells you how many people stay on the site long enough to take another action. Before, a low bounce rate could mean someone read your entire blog archive going back to 2007 OR that they had 39 tabs open for a week. Now, you’ll know if a site visitor was idle or active — giving a clearer picture of whether or not they were truly engaged. The bottom line is that GA4 offers a slightly more accurate view of engagement, but your UA to GA4 bounce rate numbers won’t match up. Your stakeholders are still going to ask about it for a while, and that’s going to be annoying to handle in reporting meetings.
Round 2 winner: draw
Round 3: Interface
If you’re a long-time user of Universal Analytics, you’re probably used to being able to dig into your data as the mood seizes you. Wondering which URLs drove the most traffic last month? Click, click, answer. Want to find out how you got so many hits from Denmark yesterday? Click, click, answer. Puzzled as to why your website suddenly ranks for terms like “can horses drink beer” (or maybe that’s just us)? You get it. The UA interface’s high degree of clickability enables ad hoc analysis. Not so with Google Analytics 4. When you log into your dashboard, you’ll see a similar view, but all of your clickability is gone. No digging into traffic cohorts. No in-the-moment drill-downs. But all is not lost. If you click the plus sign on a measurement in GA4, you can set up secondary dimensions. So if you want to check which referral URLs are driving traffic, you can build that query in. Maybe you want to set up secondary dimensions to narrow down data here and dig in deeper there. With some time and patience, you can eventually get to a view similar to something you might have seen in UA. But your “click, click, answer” days are over. Sorry about that.
Round 3 winner: UA
Round 4: Reports
We know people who have used Universal Analytics their entire working lives and never built a custom report. If you open your UA account, you’ll find dozens of out-of-the-box reports to meet your needs. While most mid-market to enterprise businesses do have some custom UA reports, they certainly aren’t required to use the tool. Now open your Google Analytics 4 account. Where UA shows you report after report to help you review your website health, traffic, campaigns, networks, landing pages, and so forth, GA4 shows you an overview screen and…two reports. Two. If that prospect leaves you feeling like you got the wind knocked out of you, you’re not alone. Once you get over the inability to migrate your data from UA to GA4, you’ll need to plan to migrate your reports. And that’s going to be a bigger problem for a lot of medium- to large-sized businesses. Take a deep breath and start by cataloging every report you use in your UA account. Map out exactly what you measure, and how the results are displayed. Note how you use the results, and which business units rely on the outputs for operational and strategic decisions. Then, start building custom reports in GA4 to answer those needs. Learn more about building reports in GA4 >> No, it’s not going to be a one-for-one replication. Yes, it’s going to be time-consuming and frustrating. Depending on your timeframe, budget, and bandwidth, you might need to look into partnering with a third party to get this work done right and in time for the switch.
On the bright side, though, this process may alert you to reporting you’ve used out of habit that wasn’t exactly right. Building new reports offers you a fresh start, and a chance to really understand what your analytics needs are and how data can help drive better decisions across the business.
Round 4 winner: draw
Spoiler alert: GA4 wins
Ultimately, the Google Analytics 4 vs Universal Analytics match isn’t a fair fight. Come July 2023, Universal Analytics is out for the count. But what you choose to do in the meantime could mean the difference between emerging victorious or limping out of the ring with broken ribs (proverbially speaking, we hope). Your approach to the GA4 transition could make all the difference. Increase your odds of GA4 success by checking out our (cage-free) resource: 6 steps to a successful Google Analytics 4 migration.
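For the technically inclined, GA4’s user-plus-events model is visible even in how data gets sent. Here’s a minimal Python sketch using GA4’s Measurement Protocol; the measurement ID, API secret, and client ID are placeholders you’d swap for your own:

```python
# pip install requests
# A minimal sketch of GA4's user + event model via the Measurement Protocol.
# Every hit is an event attached to a user (client_id) -- not a session.
import requests

MEASUREMENT_ID = "G-XXXXXXX"    # your GA4 data stream's measurement ID
API_SECRET = "your-api-secret"  # created under Admin > Data Streams

def send_event(client_id: str, name: str, params: dict) -> None:
    """Send one event for one user to GA4's collection endpoint."""
    requests.post(
        "https://www.google-analytics.com/mp/collect",
        params={"measurement_id": MEASUREMENT_ID, "api_secret": API_SECRET},
        json={"client_id": client_id, "events": [{"name": name, "params": params}]},
        timeout=10,
    )

# The same user (one client_id) generating events across visits and devices,
# like Patricia in the scenario above:
send_event("patricia.123", "page_view", {"page_title": "Home"})
send_event("patricia.123", "generate_lead", {"form": "contact"})
```

Notice there’s no session object anywhere: GA4 stitches the story together from the client_id, which is exactly why its numbers won’t line up with UA’s session-based counts.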
Accelerating digital transformation in finance and banking
From online transactions to mobile payment apps like Venmo, today's consumers increasingly look for digital access to funds — and they expect a seamless experience. As customer expectations and the economy continue to evolve, digital transformation in finance and banking needs to keep pace. Banking culture hasn’t always kept up with what digital customers are actually looking for. The World Banking Report 2021 reported that, “Despite being vocal about improving the customer experience, the banking industry’s delivery of the key components of a strong customer experience, such as improving transparency and social responsibility, improving customer support, and reducing the cost of services, falls far short of customer expectations.”
Three common barriers to digital transformation in banking
There are several common challenges to digital transformation that keep banks from pivoting quickly to meet customer expectations.
Roadblock #1: Technical debt
As a highly regulated industry, traditional banking relies on complex and siloed legacy technologies that are often expensive to maintain. Over time, technical investments compound, making it increasingly difficult to find the time or resources to shift to more modern, scalable platforms. When banks grow through mergers and acquisitions, attempting to integrate additional legacy systems adds to that technical debt. At the same time, banks face increasing competition from the fintech sector — online-first financial institutions that aren’t encumbered by aging platforms. Traditional banks saddled with technical debt may feel that they lack the time or resources to fully integrate, modernize, or replace their legacy technologies. But the longer this debt persists, the harder it is to compete with digital natives, leaving banks less agile in the marketplace. How platform modernization helped make an annuity organization more competitive >>
Roadblock #2: Organization size
Like many enterprise-level organizations, larger banks often create internal digital teams that combine business, IT, and marketing capabilities and develop expertise in their own technologies, systems, and processes. Faced with competing internal priorities and hampered by regulatory constraints, these internal teams may struggle to get alignment and prioritization for a banking digital transformation strategy, and they may lack the breadth of expertise necessary to implement a comprehensive modernization effort. Smaller banks, on the other hand, may be more nimble and more successful at shifting internal priorities, but they may not have the resources to staff dedicated teams. While organization size is often called out as a hindrance to effective digital transformation in banking, the underlying problem may not actually be a headcount issue. Regardless of size or industry, most companies miss their digital transformation goals due to a lack of clarity and strategy. “Digital transformation” in finance or any sector can be hard to define, implement, and measure. A more strategic approach starts with identifying concrete problems or issues, understanding customer needs, and developing solutions that bridge the gap with action steps that are clear, dynamic, and measurable. How technology strategy comes to life >>
Roadblock #3: Relying on assumptions about customer needs and wants
Understanding customers’ needs, pain points, and experiences can be difficult, and as users adapt to technology, their preferences continue to change.
This makes audience research even more critical when defining your bank’s digital transformation strategy. After a surge in remote work due to Covid, comfort levels with technology are at an all-time high. Research from McKinsey found that 75% of people who used digital channels for the first time during the pandemic indicate that they will continue to use them when things return to “normal.” Not only are customers more comfortable with banking technology, but it has also become an important factor in choosing which bank to use. According to Mobiquity’s 2021 digital banking report, 40% of respondents agreed that they are likely to switch accounts to get better digital tools. Investing in both qualitative and quantitative data can dispel assumptions about your audience, while also revealing specific ways to improve the customer experience. As those opportunities are identified, banks can prioritize the technology and services that will have the biggest impact. What goes into a successful customer experience strategy >>
How to approach a digital transformation strategy in banking
Given these challenges and the continuous evolution of customer expectations, several technologies offer significant potential gains and can help financial institutions stay competitive.
Mobile app enhancements
Mobile banking apps typically offer the ability to check balances, transfer funds, pay bills, and chat online with a bank representative. By building applications that go beyond these basic services, banks can increase their new customer base while improving customer retention and lifetime value. Leaders in the banking space now include peer-to-peer payments, lending inquiries, and chatbots as part of their applications. However, in addition to monitoring what competitors are doing, it’s important to implement a robust discovery process to see what the target audience wants from a banking app. This could include developing target personas and performing pain-point analysis to find unique solutions and services that better appeal to customers’ needs. From there, financial institutions are better poised to tackle the next layer of technology for the app space — personalization. Many banks are investing in personal financial management tools and customized product offerings in their apps, making banking more accessible and valuable than before. These user-friendly applications and their customization capabilities are an integral part of digital transformation in banking. Refine your mobile applications and provide a better customer experience >>
Machine learning
Historically, machine learning engagements have required substantial data science and model training investments. But major ML platforms have evolved, lowering the barrier to entry for these projects. Now, midsize and even smaller banks can use machine learning models to better understand their customers and drive a more personalized experience. And machine learning isn’t just valuable for deepening current relationships; it can also help banks target and acquire new business by identifying trends and opportunities. This means higher quality leads, improved retention, and an increase in business with more potential for high lifetime value. [On-Demand] Reimagining customer insights, risks, & relationships through machine learning >>
Data management strategy
Traditional lending institutions underwrite loans by using a system of credit reporting.
Banks that process loan applications evaluate risk by looking at credit scores, homeownership status, and debt-to-income ratios. Today, three major credit bureaus provide this information. But these reports can sometimes contain erroneous information, and the information comes at a high cost since it can only be found in three places. And while banks often collect their own internal data, if that data is incomplete or disorganized, it cannot offer useful insight. With structured data management strategies, financial institutions can mitigate losses by generating more data and using it to recognize trends and potential liabilities. See how one bank improved ROI by 1054% through strategic data management >>
Robotic process automation (RPA)
Some banking processes are still highly manual. Consider routine tasks like opening an account or reporting a stolen credit card — it takes time to get through the questions, and it usually requires a phone call from the customer. With robotic process automation (RPA), in the case of a stolen credit card, the workflow can automatically cancel the old card, issue a new card, and confirm the mailing address for the new card. RPA can also identify fraud or theft with greater accuracy than a human analyst. RPA even has the potential to assist with workload transformation. In addition to streamlining and automating internal processes, RPA can be used to manage the cloud technologies that institutions rely on for their everyday tasks. This leads to more refined workload placement — and therefore a more productive workforce.
The bottom line on digital transformation in banking
From highly personalized service offerings to easy-to-use applications, consumer expectations are high in the banking sphere. To keep up with these expectations, banks must position themselves to adapt quickly. Traditional banks are often at a disadvantage to digital-only competitors. Newcomers operate without the burden of legacy systems and outdated business models. But a digital-first attitude can help financial companies effectively implement technologies that enable digital transformation in banking. Find out how one financial services firm successfully handled digital transformation >> Ready to boost your productivity and customer engagement? Let us know your questions and find out how a strategic approach to digital transformation can help your bank thrive in a digital-first world.
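To ground the machine learning point above, here’s a minimal Python sketch using scikit-learn. The features and training rows are entirely synthetic, and a production model would need far more data, validation, and fairness review; this only shows the shape of the approach:

```python
# pip install scikit-learn
# A minimal sketch: scoring loan risk from a bank's own internal data.
# All features and training rows below are synthetic, for illustration only.
from sklearn.linear_model import LogisticRegression

# Columns: [debt_to_income_ratio, credit_utilization, years_as_customer]
X = [
    [0.15, 0.20, 8], [0.45, 0.85, 1], [0.30, 0.40, 5],
    [0.55, 0.90, 2], [0.10, 0.10, 12], [0.50, 0.70, 3],
]
y = [0, 1, 0, 1, 0, 1]  # 1 = defaulted, 0 = repaid

model = LogisticRegression().fit(X, y)

# Score a new applicant using data the bank already holds, rather than
# relying solely on third-party bureau reports.
applicant = [[0.35, 0.60, 4]]
print(f"Estimated default risk: {model.predict_proba(applicant)[0][1]:.0%}")
```

The design point: the bureau report becomes one input among many, and the bank’s own relationship data starts earning its keep.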
Spot the camels on your digital transformation roadmap
When the idea has legs, but you don't know how to use them
In the early 1850s, the U.S. government was looking for ways to move troops and supplies through the American Southwest. Their trusty pack mules were no match for the arid conditions of west Texas, New Mexico, and beyond. Sound like a use case for camels? The government thought so. And when the Army initiated a proof of concept with a small herd of camels and a few camel drivers imported from the Ottoman Empire, the hypothesis proved out. Where mules died of thirst, camels kept moving. Where the U.S. wanted to blaze a trail, the camels not only demonstrated endurance, but also survived rattlesnake bites with no ill effects. The project was such a success that the Camel Corps commander asked Congress for an additional 1,000 camels. Instead, three years after the initial investment, the costly program disbanded, and the camels were auctioned off. Why did the Army take a loss on such a seemingly successful project? The answer seems simple in hindsight: very few American soldiers knew how to lead camel trains. Compelling use case. Promising POC. Project ultimately fails due to unforeseen upskilling and scalability challenges. Sound familiar? To escape that script, successful companies take a strategic approach, including change management, cross-training, and outsourcing in plans for any new technology tool or platform. After all, even the most exotic solutions are only as good as your ability to integrate, scale, and manage them for ongoing success. If you’ve tossed some camels on your digital transformation roadmap, it might be a good idea to stress test your strategy before you bet the ranch. Whether you need to ask a quick question, talk through a specific roadblock, or get a major endeavor unstuck, we can help. Our team of digital, data, and technology experts can, metaphorically at least, help you lasso those camels (watch out, they spit) and deliver transformative results.
Get smart: You can read a fictionalized version of the true story of the Camel Corps in Téa Obreht’s novel Inland, which takes on cowboy legends from an unexpected point of view.
There's no crying in baseball. Or in GA4.
Rub some dirt on it, marketers. When a behemoth like Google rolls out a wholesale change like GA4 while the new product is still in beta, it’s understandable if you miss some fly balls or pull a muscle. New reports. New features. The vague sense that you should be grateful for the opportunity while also feeling dumb that you don’t exactly get the nuances. If all this leaves you feeling like you’re getting yelled at by Tom Hanks in 1992’s A League of Their Own, we understand. Universal Analytics won’t be available for much longer, and GA4 is a moving target. But you don’t have to beat yourself up about it. We’ve got a team digging into the transition to help you stay in the game.
Get smart. Not sure which play comes next? Need to catch some GA4 basics? Here are a few resources to get you started:
[Deep Dive] Build a solid game plan: 6 steps to making your GA4 transition easier
[Offer] Get your GA4 migration back on track
[30-Min Webinar] Maximize your GA4 opportunities
Is data virtualization the droid you've been looking for?
Unleash the force of use cases
If you keep up with the Star Wars canon, you might have some thoughts on droids. In a galaxy far, far away, these robot characters can be useful – flying your spacecraft, babysitting your large-eared infant, loading heavy objects without talking back – but they can also easily be misapplied. One minute you have a friendly new toy, and the next thing you know it’s gone over to the dark side and cut the power to your retractable space dome. Perhaps this reminds you of data solutions you’ve tried. It’s easy to get on board with a new technology idea. Data mesh, data fabric, and data democratization are heady concepts and terrific solutions – in the right context. So how do you harness the power of your data without inadvertently triggering a galactic crisis? You could get Jedi-master-level at wielding a light saber. But it might be faster (and, yes, safer and more realistic) to start with use cases.
Maybe you need to connect information across multiple platforms to create a 360° view of your customer
Maybe your business users need real-time data integration to gain the operational intelligence that drives better predictions
Maybe you’re looking for a data-as-a-service solution to provision your suite of applications
In these situations, data virtualization might be part of the droid solution you’ve been looking for. As a technology solution, data virtualization doesn’t stand alone. If you don’t have master data management, governance, and quality processes fully locked in, implementing data virtualization could put your whole operation at risk. If you need help identifying or prioritizing use cases or aren’t sure what steps to take next with your data, let us know. We don’t do robot companions (yet), but our customized data jumpstarts can give you a hyperdrive boost on your path to data maturity.
Get smart: If Obi Wan Kenobi (currently streaming on Disney+) hasn’t given you enough time-warp whiplash, you could try reading The Kingdoms by Natasha Pulley. If the book doesn’t underline the importance of empowering your decision-makers with integrated data views, we don’t know what will.
The future of identity & access management
In this case, we're not looking for plot twists. In the gripping mystery thriller film The Invisible Guest, a wealthy tech entrepreneur finds himself entangled in a murder case and hires a top lawyer to build his defense. As the entrepreneur's story unfolds, the audience is kept guessing until the final identity is revealed, which, we must confess, we did not see coming. Hopefully, the same can't be said for your organization's technology platforms. As more and more data and applications move to the cloud, and employees increasingly expect seamless BYOD experiences, companies are looking for identity and access management solutions that balance flexibility with robust security. Whether you're looking for a new tool, need to integrate or authenticate across new data streams, or want to create complex multi-tenancy and role-based access rules, we can help. Our technology and data teams can help you identify and implement the right solutions to keep identity records and access rules secure and seamless – so you can keep your business moving forward. Get smart: No shade intended for superheroes and middle-aged men flying jets, but if you're looking for a gripping mystery thriller this weekend, we'd recommend The Invisible Guest. It's currently streaming on Netflix with English subtitles. That's right, you get culture points and suspense. And if you've been wondering about simulation theory and hoping to find a sci-fi outlet to help you think it through, Sea of Tranquility turned out to be better than When We Cease to Understand the World, in our opinion. If, after getting all turned around by cinematic and literary plot twists, you want to explore tech solutions, let us know. We can't deliver time travel, fix the lighting on your moon colony, or help you prep for a Spanish deposition, but real-world identity and access management solutions are right up our alley.
Digital transformation in healthcare: Improving patient experience, care, and outcomes
As the pace of technological change continues to increase, digital transformation in healthcare often struggles to keep up. Challenges like integrating aging legacy systems, maintaining patient privacy, and turning disparate data sources into actionable insights loom large in healthcare, where time and resources are often at a premium. But the same circumstances that make digital transformation in healthcare more difficult are the very things that underline its importance. When patient lives are on the line, digital transformation isn't just a "nice to have." Healthcare systems that achieve their digital transformation goals see immediate improvements in patient experience, quality of care, and patient outcomes. From that standpoint, digital transformation in healthcare isn't just about adding technology, it's about revolutionizing the processes and systems that drive the health and well-being of the population as a whole.
Case study: Life-saving technology in diabetes long-term care >>
Putting patients first
While individual healthcare providers commonly put their patients' needs front and center, the system as a whole did not evolve with that mentality. Due to a variety of factors, including payer systems, consolidation, and the regulatory environment, healthcare systems got a reputation for siloed information, duplicate workflows, lack of clarity, and confusion. As healthcare organizations seek to modernize, smart health systems are taking a consumer-centric approach — redesigning patient experiences and pathways while improving care delivery and outcomes using digital technology.
Article: Transforming customer engagement in the digital age >>
Planning the future of digital transformation in healthcare
During the pandemic, industries accelerated digital transformation efforts across the board, and healthcare was no exception. Out of necessity, more medical touchpoints and interactions moved online, from virtual office visits to automated triage to digital paperwork. Now, two years into the new normal, healthcare organizations are taking stock of their progress, appreciating the speed and scale of their efforts, and mapping opportunities for the future. A recent Deloitte study found that 60% of health systems say they are about halfway through their digital transformation journey. In our experience working with technology innovators and leaders across industries, the halfway mark is where things can get messy. Digital transformation is a long game, and organizations often get bogged down there. To keep moving forward and avoid costly wrong turns, healthcare leaders need a fresh vision and a renewed roadmap. Evolving digital transformation in healthcare to meet the changing expectations of patients and providers requires a commitment to a digital-first, people-centric approach, but it offers great opportunities for continued growth in connection, innovation, and successful outcomes. Based on our experience, we see five key areas where focused efforts can deliver outsized returns for healthcare systems that are midway through their digital transformations:
1. Modernize legacy systems to give providers and patients more options
While the vast majority of individual healthcare providers and healthcare organizations use an electronic health records (EHR) system, relatively few seamlessly integrate with patient portals.
A recent PEW Health Information Technology (HIT) survey found that almost 80% of respondents wanted to access and view their electronic health records through a website, an online portal, a mobile app, or electronically in some other way. Moreover, the same survey highlights a strong desire for their doctors to share information about their health status. For most healthcare organizations, integrating patient records across practices and within portals is a headache at best. Adding in the other digital interactions that today's consumers expect — such as automated appointment and prescription workflows, chatbots, pre-filled forms, and instant answers — might seem impossible. Delivering a better patient experience and giving providers greater flexibility with their tools often takes a more strategic view. Rather than layering in more and more technology solutions, smart healthcare organizations take a holistic approach to modernization, creating flexible, modular solutions that give patients and providers more options in the near term while also making future enhancements easier.
Case Study: How an AI healthcare company optimized its digital experience >>
Article: Modernization challenges and the path forward >>
2. Mitigate risk to build patient trust
In addition to technology lag, healthcare systems also struggle to connect patient health information due to regulatory constraints. To maintain HIPAA compliance in the US and GDPR compliance for EU patients, healthcare organizations sometimes limit the very information sharing that would result in higher-quality care. To meet patient expectations for data privacy and personal health data security while also delivering on modern expectations for functionality and connectivity, health organizations need to build in best practices for security and governance throughout their technology architecture. While there are myriad ways to approach this issue, a few key options deserve consideration:
BYOD Policies
A 2019 study found that 63% of healthcare organizations sustained a security incident related to unmanaged and IoT devices. Given the rapid acceleration of digital transformation in healthcare since 2020, we suspect that number is much higher today. As healthcare organizations modernize systems and integrate more virtual and IoT solutions into their technology stacks, a robust, up-to-date BYOD policy becomes more important. Developing a compliant, enforceable strategy is a critical step in your modernization efforts.
Case study: Navigating BYOD in a highly regulated industry >>
Containerization
One way to mitigate risk is to containerize data, workflows, and applications in the cloud. Although the cloud can sometimes get a bad rap for security, a carefully designed strategy puts security first and can prevent any breach from spilling over too far into other parts of your architecture.
Article: Maintaining a composable enterprise >>
Blockchain
Best known in the context of cryptocurrency, blockchain uses a computerized database of transactions to allow secure information exchange without the need for a third party. Applying blockchain technology to the healthcare industry could improve information security management; healthcare data can be communicated and analyzed while preserving privacy and security. Countries like Australia and the UK have started experimenting with blockchain technology to manage medical records and transactions among patients, healthcare providers, and insurance companies. In both examples, decentralized networks of computers handle the blockchain and simultaneously register every transaction to detect conflicting information, keeping records accurate and making them more difficult to hack.
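To make the tamper-evidence idea concrete, here's a minimal Python sketch of the hash-chaining concept that underpins blockchain ledgers. It's a toy illustration under simplified assumptions, not a production blockchain or any specific healthcare platform, and the record fields are hypothetical.

```python
import hashlib
import json

def digest(payload: dict) -> str:
    # Hash the canonical JSON form so any edit changes the digest
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def append_record(chain: list, record: dict) -> None:
    # Link each new block to the digest of the previous one
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"record": record, "prev_hash": prev_hash}
    block["hash"] = digest({"record": record, "prev_hash": prev_hash})
    chain.append(block)

def verify(chain: list) -> bool:
    # Recompute every digest; a single altered record breaks the chain
    for i, block in enumerate(chain):
        if block["prev_hash"] != (chain[i - 1]["hash"] if i else "0" * 64):
            return False
        if block["hash"] != digest({"record": block["record"], "prev_hash": block["prev_hash"]}):
            return False
    return True

chain: list = []
append_record(chain, {"patient_id": "demo-001", "event": "lab result shared"})
append_record(chain, {"patient_id": "demo-001", "event": "claim submitted"})
print(verify(chain))                        # True
chain[0]["record"]["event"] = "tampered"    # any edit is detectable
print(verify(chain))                        # False
```

In a real deployment, the "decentralized" part comes from many independent parties holding copies of the chain and rejecting blocks whose digests don't agree.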
Article: Building trust in your data privacy compliance >>
3. Use voice and wearables to enhance patient experience and outcomes
Wearable devices and IoT-based health sensors can track a patient's conditions and activities remotely, from vital signs and hydration to the onset of a medical crisis event. The data collected can help healthcare providers better guide patient care. Healthcare providers use IoT and wearable data for remote monitoring and preventative care, providing more specific, personalized connections even with lower staff coverage. Machine learning also drives AI-based natural language processing technology in the healthcare space. As more patients become familiar with voice assistants like Alexa, Siri, and Google Home, healthcare organizations see potential to deploy the technology for tasks like triage and treatment reminders. For example, the UK's NHS uses voice technology to field common questions, deliver health information, and remind patients to take medication.
Case study: Using wearables to improve patient care >>
4. Put data to work for predictive and preventative care
Healthcare organizations collect volumes of data but traditionally haven't used advanced analytics to translate the information into actionable insights. Today's leading provider systems are exploring how real-time business analytics, predictive analytics, and AI can transform patient experience and how care is delivered. In much the same way that businesses use data analysis to spot trends, forecast consumer behavior, and drive purchasing decisions, healthcare organizations can use the information they collect to understand patient expectations, discover areas of dissatisfaction or waste, and identify opportunities to enhance the overall experience of patients with their facilities. Likewise, providers can use patient data to understand how a unique individual responds to treatment, spot key diagnostic markers, and even predict potential outcomes so that doctors and patients can work together to minimize risk.
Article: Data analytics in healthcare settings >>
5. Automate administrative tasks to focus on patient care
The growing number of administrative tasks imposed on physicians, their practices, and, by extension, their patients adds unnecessary costs to the healthcare system. Excessive administrative tasks also divert time and focus away from providing actual care to patients. Tools like robotic process automation (RPA) can help healthcare systems save time and resources in areas such as administration, billing, and human resources — freeing up more time for face-to-face interaction with patients. When it comes to finding the right applications for automation in healthcare, it's important to keep patient experience at the center of your strategy. Developing a customer-first automation strategy can help create the right blend of automated and human interactions, one that meets today's expectations and delights patients rather than frustrating them. A simple sketch of the idea follows.
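To give a feel for the category, here's a deliberately simple, plain-Python stand-in for one such automation: drafting appointment reminders from a scheduling export. Real RPA platforms do this kind of work against GUIs and enterprise systems at scale; the file name and columns here are hypothetical.

```python
import csv
from datetime import date, datetime

APPOINTMENTS_FILE = "appointments.csv"  # hypothetical daily export: patient_name, appt_time

def build_reminders(today: date) -> list[str]:
    # Draft a reminder message for every appointment scheduled today
    reminders = []
    with open(APPOINTMENTS_FILE, newline="") as f:
        for row in csv.DictReader(f):
            appt = datetime.fromisoformat(row["appt_time"])
            if appt.date() == today:
                reminders.append(
                    f"Hi {row['patient_name']}, this is a reminder of your "
                    f"appointment today at {appt:%I:%M %p}. Reply C to confirm."
                )
    return reminders

if __name__ == "__main__":
    for message in build_reminders(date.today()):
        print(message)  # in practice, hand off to an SMS or email gateway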
Article: Finding the right use cases for automation >>
Evolving patient care through digital transformation in healthcare
As the digital tools, apps, and resources pioneered during the pandemic continue to evolve, healthcare leaders must continue to push ahead with digital-first, patient-centric investments in technology, integrations, and solutions. Finding the right balance between patient and provider expectations, maintaining compliance, and enhancing patient care requires a mindset that values the patient's perspective. Ready to take the next step?
Get a machine learning jumpstart >>
Get a better view of your data analytics maturity >>
Refresh your digital transformation roadmap >>
Wherever you are on your digital transformation journey, our team of digital, data, and technology experts can help.
Ask us your questions about digital transformation in healthcare >>
Managing the need for (data) speed
Who's going maverick with your data? It's probably not the first Top Gun reference you've seen today, but when we think about the perils of shadow IT for data governance and the importance of handling data democratization well in your organization, we couldn't think of a better analogy. Today's knowledge workers need data, and they need it at Mach speed. If your organization funnels requests for data insights through your IT department and they're experiencing significant flight delays or are overbooked, you can't blame the business for turning to shadow sources for quicker results. Many organizations turn to self-service BI solutions to manage this problem, but if you don't manage governance with solid workflows, you can quickly stall out or fall into a tailspin. (We promise we're going to stop with the flying metaphors. Soon.) We're finding that a lot of companies aren't sure exactly where they stand when it comes to managing data democratization. You might need a complete governance overhaul, but, more likely, a quick assessment and game plan could get you back on track. Whether you need an audit, a stakeholder workshop, or a quick call to level set, we're here to help. Shadow IT, data governance, that AI they built to deliver Val Kilmer's line…let us know what's on your mind. Get smart: Not to be painfully obvious, but the new Top Gun: Maverick is out in theaters nationwide. The original is on Amazon Prime as of June 1. And if you really want to geek out, you can read the 1983 article that inspired the franchise.
Speed to value delivery: Optimizing with data virtualization and data fabric
This article originally appeared in CDO magazine. Data and analytics have long held promise for helping organizations deliver greater value across the entire stakeholder landscape, including customers, associates, and partners. However, since the beginning of the data warehousing and BI movement, achieving business value rapidly — in alignment with windows of opportunity — has proven elusive. For an organization to be competitive in the era of digital transformation, data must be front and center — and accessible in near real-time. But many organizations are struggling with data that is deeply buried, complex to access, difficult to integrate, and inaccessible to business users. Problems like these diminish the value of your data and its ability to inform decision-making at all levels.
For most organizations, it's hard to produce value from data quickly
The main challenge has been the distributed and siloed nature of the data subjects that need to be integrated to achieve business-relevant insights. Data subjects — customers, products, orders, warehouses, etc. — typically reside in different systems and databases, requiring extraction, transformation, and loading into a common database before analytics can be run. Often, data delivery solutions like data warehouses, self-service BI, and data lakes are used to try to unlock these data silos; however, each of these solutions presents drawbacks in terms of effort, complexity, cost, and time to market. That is where data virtualization comes in, delivering a holistic view of information to business users across all source systems.
So what exactly is data virtualization?
In its simplest form, data virtualization allows an organization to attach to its data subjects where they reside, in real time. It presents disparate data subjects through a semantic layer that enables them to be integrated on the fly to support query and analytic use cases. By eliminating the need to design and build complex routines that move data from multiple source locations into a single integrated data warehouse, products like Denodo enable organizations to compress weeks to months of data preparation time out of the idea-to-execution value stream. As a result, value delivery is significantly accelerated.
Learn more about how Fusion & Denodo can help you streamline data access to support your most critical business needs >>
Optimization with data fabric
While data virtualization integrates data from different sources into one layer to provide real-time access, data fabric is an end-to-end architectural approach that allows organizations to manage massive amounts of data in different places and automates the integration process. A data fabric has a huge job to do and needs a robust integration backbone to do it: it must support many data sources, be compatible with several data pipeline workflows, support automated data orchestration, empower various kinds of data consumers, and more. To do this successfully, a data fabric requires powerful technologies and a solid data integration layer to access all data assets. Many in the data community believe that you must choose data virtualization OR data fabric, but that is not the case — and that solid data integration layer is an example of why. The reality is that data fabric can be operationalized through data virtualization, optimizing your modern data architecture and allowing you to move at the speed of your business. By building a model that utilizes both concepts, businesses make finding, interpreting, and using their data nearly seamless.
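To make "integrated on the fly" concrete, here's a toy sketch of the pattern: answering a question by querying two live sources at request time rather than first loading both into a warehouse. This illustrates the concept only; it is not Denodo's API, and the source names and columns are hypothetical.

```python
import sqlite3
import pandas as pd

# Two disparate "systems of record" standing in for real sources
crm = sqlite3.connect("crm.db")      # e.g., customer master data
orders = pd.read_csv("orders.csv")   # e.g., an export from an order system

def customer_360(customer_id: str) -> pd.DataFrame:
    # Resolve the query against both sources at request time --
    # nothing is copied into a central warehouse first
    profile = pd.read_sql(
        "SELECT customer_id, name, segment FROM customers WHERE customer_id = ?",
        crm, params=(customer_id,),
    )
    history = orders[orders["customer_id"] == customer_id]
    return profile.merge(history, on="customer_id", how="left")

print(customer_360("C-1042"))
```

A data virtualization platform generalizes this idea: the semantic layer exposes "customer" as a single queryable object, and the engine handles source connectivity, query pushdown, and caching behind the scenes.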
Technology by itself isn't the answer
Even with the proven results of this class of technologies, many organizations continue to struggle with traditional data management and analytic architectures and solutions. This inability to adopt new approaches for data management and analytics only serves to deprive decision-makers of the rapid access to insights they need to stay agile in a pandemic-induced, rapidly transforming global digital economy. The solution is not just found in technology. Instead, it is found in the minds of the humans responsible for delivering data management and analytic capabilities. It is a human change management problem we face. Remember the adage, people/process/data/technology? The next frontier to be conquered is optimizing the thinking and innovation risk tolerance of those who steward data management and analytics solutions within organizations. What do you think? Is your organization facing any of these issues or trying to tackle how to deliver significant value — better, faster, cheaper, smarter? I'm happy to chat about where you are and how to get where you'd like to be. If you want to talk, send me a note.
Market like an astronaut
Contingency planning isn’t crying wolf In a 1996 SNL skit, comedian Dana Carvey impersonated a news anchor pre-recording announcements of things that might happen while he was on an extended vacation. The scenarios got more and more ridiculous. What if a former U.S. President were mauled by wolves? Well, as the off-camera producer pointed out, it happened to President Taft. That’s why we plan for contingencies. Astronauts do a similar type of preparation before space travel. Rather than filming sober announcements for newscasts, the astronauts perform simulated scenarios of everything that could possibly go wrong in space. The goal is to build muscle memory so that when something unanticipated goes wrong – and it will – the astronauts can keep their stress in check and work on solutions. As former NASA rocket scientist Ozan Varol says, “Astronauts maintain their calm not because they have superhuman nerves. It’s because they have mastered the art of using knowledge to reduce uncertainty.” Lest you think that marketing “isn’t rocket science,” Varol goes on to explain how the same principles of problem solving used (or catastrophically misused) in the space program also drive performance in more pedestrian forms of work and life. Does your team have the knowledge to handle unexpected pivots? After all, even the most comprehensive plans can be impacted by unforeseen circumstances, random events, or the speed of change in today’s economy. Modern problem-solving demands trustworthy insights. That’s why when we talk about the importance of getting your GA4 instance going by mid-July to preserve year-over-year data, we aren’t just crying wolf. Whether you’re building your marketing program slowly or aiming for the stars, rocket scientists, U.S. Presidents, and comedians agree that the key to finding solutions in the midst of uncertainty is the ability to find and apply knowledge in any contingency. Get smart: In his book Think Like a Rocket Scientist: Simple Strategies You Can Use to Make Giant Leaps in Work and Life, Ozan Varol writes about lessons he learned working on NASA’s Mars program and how we can change our thinking to make better decisions in any job or circumstance. It’s a pretty fascinating look at the space program, and also fairly useful as a self-help/business read. And whether or not you’re into summer reading, take a few minutes between calls to watch the Dana Carvey news anchor sketch. It’s a classic.
Reimagining the skillset supply chain
The pace of change and unpredictable circumstances of the past couple of years have led many companies to rethink their just-in-time approaches to resourcing tangible goods and materials. But why stop there? To scale and adapt fast, companies also need a new approach to how they resource skillsets. One of our clients, PRECISIONxtract, did just that. By taking a just-in-time approach to their shifting skillset needs, the company was able to scale up fast — and minimize risk — in a changing business environment.
A right-fit-first approach
PRECISIONxtract's transformative healthcare market access solutions offer patients and providers unprecedented connection to the right medication and resources in clinical settings. To bring that vision to life, PRECISION could have found a series of single-skill vendors or taken the time to recruit and onboard new employees. Instead, they looked for a cross-functional partner that would be a seamless fit with their company culture and that had the right mix of scalable skills. They found that fit with Fusion Alliance. Fusion quickly became an integral part of PRECISION's team, assembling a group of more than 20 strategy, data, and technology experts to deliver responsive support for a growing set of initiatives.
Boosting surge capacity across disciplines
Knowing that their flagship product, Access Genius, needed design and functionality upgrades, PRECISION called on Fusion to assess and modernize the application without disrupting the existing business. To avoid downtime and increase speed to market, our team used an Agile process and a model-driven design, in which models from the source code informed modernization efforts. Streamlining the overall architecture not only saved development time, but also made Access Genius easier to deploy to PRECISION's clients. And, to make the product easier to maintain and cheaper to run, we applied containerization through a microservices model and moved Access Genius to a distributed cloud hosting framework. Our solution provided real-time customer insights delivered across a variety of digital channels, in lieu of a people-driven process. This helped take Access Genius:
- From a complex, cumbersome legacy monolith to a lightning-fast, distributed, cost-effective, cloud-native solution
- From a user-driven, database-centric format to a distributed API-based framework, enabling immediate data updates for important cost and coverage changes
- From a time-intensive customer engagement portal to an intuitive, streamlined, automated process
Equipped with a modern, stable, extensible platform, PRECISION was free to explore opportunities for more radical innovation.
Disrupting the market with frictionless access to timely data
Although Access Genius successfully broke down barriers with data, the solution's interface required users to navigate a complex dashboard with manual clicks and drop-downs. For pharma teams with limited time to connect doctors to information, seconds count. Working with PRECISION's product team, Fusion technology experts analyzed the friction point of manual navigation and explored ways to make Access Genius more seamless for the user. Drawing on deep expertise deploying cutting-edge technologies into highly regulated spaces, Fusion suggested exploring a shift away from a traditional web-based interface to an AI-enabled voice functionality that would connect users to the most relevant data and messaging right in the flow of conversation.
Changing the way pharma enablement tools go to market
At the same time, other Fusion consultants were hard at work rethinking the way PRECISION's products reached, empowered, and retained customers. We brought in a range of specialists to bring new strategies to life, including:
- Instructional designers and training developers created an interactive training platform to equip pharma sales reps with greater confidence in provider interactions by deepening their understanding of the Access Genius tool. RESULT: Access Genius IQ, a new training tool that helps PRECISION customers see faster ROI for their Access Genius investment
- Brand experts, visual designers, content strategists, and web developers elevated visual brand elements and created websites, editorial content, and outreach campaigns. RESULT: New website architecture, design, and content; long-form lead generation content; prospect cultivation email marketing
- Digital marketing strategists, creative designers, and ad teams implemented innovative ad campaigns in rapid succession as PRECISION had more time to develop and roll out new products. RESULT: LinkedIn ad campaigns generating 3X the leads, including 100 qualified leads in the first 90 days
Read more about the success of Fusion's marketing partnership with PRECISION >>
Reimagining the skillset supply chain
Partnering with Fusion gives PRECISION access to a huge team of experienced consultants with a wide range of skillsets — allowing the company to surge and scale as their business needs and market realities shift. With Fusion bringing in the right people at just the right time, PRECISION saves valuable time and resources, enabling them to be more innovative, more agile, and more impactful for their customers, healthcare providers, and patients.
Ready to explore how Fusion skillsets can help your team succeed?
Our ongoing work with PRECISIONxtract is just one example of how we help companies build momentum for a digital-first world. We bring big-picture thinkers, technology-minded creatives, data scientists, and technical experts to work alongside our clients, providing a force-multiplying effect that leads to scalable, future-focused solutions for the most complex challenges. Ready to get started? Let's talk.
Level up your data management
On the journey from data to analysis to insight, companies are shifting from a traditional approach and leaping forward into new ways of delivering actionable business intelligence. While the core goals remain the same — enabling data-driven decisions, optimizing cost efficiencies, and driving revenue growth — new tactics demand new skills. The pivot to a use-case model, with good governance throughout the data lifecycle, meets those challenges while also delivering faster time to insight.
Connecting the business to fit-for-purpose data
The new data mindset is purpose-driven. Based on specific use cases generated by the business, today's data teams build, deploy, and configure purpose-built data assets that meet the organization's needs fast. This represents a significant shift from the status quo for traditional data teams, but the streamlined workflow pays off. To generate fit-for-purpose data, start here:
- Solicit use cases from the business
- Understand and analyze the characteristics and dynamics of the use case
- Assess your existing data portfolio and identify information that might meet the need
- Consider the appropriate technology to synthesize data sets and deliver actionable insights
Establishing a data asset creation workflow pays off in efficiency and value for IT and the business units involved.
Learn more about how to develop data as an asset >>
Speeding time to insight
Traditional data warehousing models impose a high cost for integrating disparate data sets. A legacy workflow might include:
- Amending the data architecture
- Creating a semantic model
- Running time-consuming extract, transform, and load (ETL) processes for all data sets involved
- Preparing the data
- Making the data available for analysis
Today's businesses can't wait that long for insights. Modern data technologies like Hadoop make it possible to stage data in a platform for immediate access. To structure your data and technology architecture toward a use-case-driven model that fosters speed to insight, key considerations include:
- A prioritized list of problems that need solutions
- Any characteristics or constraints that might impact time to value
- Available data assets and technologies, like data virtualization, that would enable you to access and analyze data in place
Once you adopt methods for analyzing data in place, your team can deliver value on a much shorter timeline, as the sketch below illustrates.
Learn more about data architecture and integration >>
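For a flavor of analyzing data in place, here's a minimal sketch using DuckDB, one of several engines that can query files where they already sit, with no ETL into a warehouse first. The file path and columns are hypothetical.

```python
import duckdb

# Query a raw landing-zone file directly; nothing is loaded or transformed ahead of time
top_products = duckdb.sql("""
    SELECT product_id, SUM(amount) AS revenue
    FROM 'landing_zone/orders.parquet'
    GROUP BY product_id
    ORDER BY revenue DESC
    LIMIT 10
""").df()

print(top_products)
```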
Improving data literacy
The demand for data-driven insights continues to accelerate. Companies at the forefront of the shift from volume to velocity use analytics pervasively throughout their organization and have the technology and agility to act on insights quickly. To become a competitive, speed-driven organization, your business must excel throughout the analytics lifecycle:
- Acquire: Harvest data quickly by exploring evolving big data technologies and optimizing first-party data strategies
- Analyze: Identify the most impactful insights
- Act: Implement the insights iteratively and strategically
That final step involves your organization's data literacy. Providing insights is one thing, but training your people to take the next right action on the data they see might require new skills. Upskilling the workforce to better understand and use data pays off richly in transformative accuracy, speed, and confidence.
Learn more about building data literacy within your organization >>
Implementing harvest-to-delivery data governance
As the volume of available data continues to increase, businesses are building complementary abilities to understand and use it. But implementing tools and technology to harvest, integrate, and analyze data without robust governance frameworks opens companies up to significant risk. Building strong governance into your data asset creation and management workflows from the start can help.
Learn more about how to implement good data governance >>
Elevating data leaders
As the world becomes more digital and more customer behaviors move to a mobile context, businesses are changing to meet and match digital footprints with geospatial dimensions. Leadership must keep pace. The need for a Chief Data Officer (CDO) at the table isn't really a question anymore. Today, leading companies are asking where analytics and digital belong in the leadership playbook. To get the most value out of your data management, the right team members — with the right support and authority in place — could not be more important.
Learn more about the importance of empowering your CDO >>
Data management can be complex. A strategic viewpoint can help. Find out more about Fusion's approach to strategic data management, or ask us your questions. Wherever you are on your data journey, we can help you keep moving forward.
Fuse Highlights: Coconuts, horses, and beer, oh my!
Every few weeks, we share insights with our Fuse subscribers along with news and trends we're following across the web, including book recommendations. Here's a compilation of some of our key insights from the last six weeks. If you want content like this delivered directly to your inbox, we've got you covered. Subscribe to the Fuse here.
Data is the Holy Grail
In the classic film Monty Python and the Holy Grail, viewers hear King Arthur and his trusty servant Patsy approaching with a trademark "clip-clop, clip-clop" sound. When the duo emerges from the primordial mist, you see (spoiler alert) that the source of all this noise is not, as might be supposed, a horse. Rather, Patsy is banging two coconut shells together as the king trots about on his own two legs. The duo is getting from point A to point B in their quest, but not in the most efficient or effective way possible. Many companies follow that script. Equipped with buzzword mandates like process optimization and data-driven decision making, it's all too easy to make small adjustments that sound like you're headed in the right direction but aren't necessarily getting you there any faster. How do you drop the coconuts and get on the horse (metaphorically speaking)? What does it look like to use data to drive optimization in real terms? We've got our eye on digital twins. Before you run away (how's that for a deep-cut Monty Python reference?) from yet another data buzzword, it's worth another look at this practical application of machine learning and data analytics. Digital twins are most often used to optimize physical assets and processes like manufacturing, warehousing, and logistics. Using sensors to collect data on a product, machine, or physical process, the digital twin feeds real-time data to a machine learning algorithm to test variables and scenarios faster — ultimately leading to actionable process improvement insights. These days, we're starting to see more businesses use digital twin frameworks to optimize and innovate non-physical business processes like accounting, HR, and marketing as well. A digital twin simulation can help you surface interdependencies and inefficiencies that might otherwise be blind spots, especially if they're baked into your business culture as "the way we've always done it." In the quest for digital transformation, don't settle for coconuts. Instead, let's talk about the ways your data can carry more of the weight for you.
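To make the digital twin idea less abstract, here's a toy sketch: calibrate a simple model of a process from observed data, then test a "what if" scenario against the model before touching the real line. The packaging line, rates, and risks are all hypothetical.

```python
import random

class PackagingLineTwin:
    # A toy digital twin of a packaging line: replay observed behavior,
    # then test scenarios against the same model
    def __init__(self, rate_per_min: float, jam_risk: float):
        self.rate_per_min = rate_per_min  # throughput observed from sensors
        self.jam_risk = jam_risk          # jam probability per minute

    def simulate_shift(self, minutes: int = 480, seed: int = 42) -> int:
        rng = random.Random(seed)
        output = 0.0
        for _ in range(minutes):
            if rng.random() < self.jam_risk:
                continue  # minute lost to a jam
            output += self.rate_per_min
        return int(output)

# Baseline twin calibrated from (hypothetical) sensor data
baseline = PackagingLineTwin(rate_per_min=42, jam_risk=0.05)
# Scenario: run the line 5% slower to cut jams in half -- worth it?
scenario = PackagingLineTwin(rate_per_min=40, jam_risk=0.025)
print(baseline.simulate_shift(), "vs", scenario.simulate_shift())
```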
Get smart: If all this talk of Monty Python and the Holy Grail puts you in the mood for an old-school movie night, good news: it's available on Netflix. And if you're looking for a more literary scratch for your Middle Ages (ish) itch, we're reading Cathedral by Ben Hopkins. It's a fascinating look at the complex processes involved in constructing architectural marvels in the days before edge computing. We may handle optimization differently now, but human nature stays the same. Read the full Fuse: Data for April here.
A horse is a horse, and other martech myths
Martech is a crowded field, and a lot of the voices weighing in on your options have a horse in the race.* No one is out to skew the odds on purpose, but your organization is unique. Just because one solution is a front-runner doesn't necessarily mean it's a great fit for your business. So how do you sort the facts from the hype and decide where to place your bets? We rounded up a few martech myths as a starting point.
Myth: One CDP is as good as another.
Fact Check: Finding the right CDP (or CRM, or DMP, or any other solution you can think of) isn't a simple box to check. And, once you make your decision, integrating and customizing your platform will also take time and attention.
Myth: Everyone needs a CDP.
Fact Check: Depending on your use cases, you might be able to do everything you need to do within your current tech stack.
Myth: You should pick a platform and go all in.
Fact Check: It probably goes without saying, but when it comes to technology there are no one-size-fits-all solutions. One product might be a great fit for your needs, but that doesn't mean that vendor should supply your entire tech stack.
Myth: Data and tech silos are just the way business works.
Fact Check: Regardless of size, scope, or industry, today's businesses can't afford to be siloed. When you're evaluating a tech solution or rethinking your entire customer data strategy, prioritizing integration is always a safe bet.
On your mark. Ready to put your martech through its paces? Read on to find resources to help you optimize your stack and get your customer data strategy across the finish line. *Full disclosure: one of our team members won $100 when Rich Strike won the Kentucky Derby, but for the most part we are platform- and livestock-agnostic.
Get smart: We try not to be too on the nose with our book recommendations but couldn't help ourselves this time. In Data Strategy, Bernard Marr offers a solid primer on the data landscape and how your organization can use it (legally and ethically) to advance your goals. Spoiler alert (though probably not surprising given the title): strategy turns out to be the foundational driver for effective data use. Whether you read the book, our Ultimate Guide to Customer Data Strategy, or just want to get a sense of potential next steps, we'd love to chat about customer data. Grab a time that works for you. Read the full Fuse: Marketing for May here.
Three technology strategies walk into a bar
If you've got a monolithic legacy system on your hands, sticking with the status quo isn't a fun choice. But going nuclear and building back from scratch probably isn't realistic. Wouldn't it be great to find a middle ground? Meet the composable enterprise. It's an iterative path toward digital transformation, with applications repackaged into components that can be used to build new solutions across the business. Piece by modular piece, you rebuild your technology ecosystem — becoming more efficient, effective, and scalable as you go. As the glue that holds those components together, APIs are key to building a composable business. And developing secure API solutions that accommodate shifting capacity demands and amplify your technology takes a hefty dose of strategy and expertise. That's what we love about it!
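To picture what one composable "piece" looks like, here's a minimal, hypothetical sketch; FastAPI is just our illustrative choice, not a prescribed stack. One small service owns one capability and exposes it behind a stable API that other components can compose.

```python
from fastapi import FastAPI

app = FastAPI()

# Stand-in for the component's own data store
INVENTORY = {"sku-123": 42, "sku-456": 7}

@app.get("/inventory/{sku}")
def get_stock(sku: str) -> dict:
    # Other components (storefront, ERP sync, reporting) compose this
    # capability by calling the endpoint, not by importing the code
    return {"sku": sku, "on_hand": INVENTORY.get(sku, 0)}
```

Run it with `uvicorn module_name:app` (module name is yours to choose) and any other component can consume the capability over HTTP; swap the implementation later and, as long as the contract holds, nothing downstream breaks.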
If APIs are your jam, too, or if you're wondering if a composable system makes sense for your business, let's talk. We're talking APIs over IPAs in Cincinnati on June 16 and you're invited. It'll be fun!
Get smart: You've probably spent the day wondering how speculative/sci-fi/literary fiction relates to API strategy and microservices (or maybe that's just us). But, we'd guess, the same type of mind that enjoys transforming legacy monoliths into composable enterprises would also really track with a book like How High We Go in the Dark by Sequoia Nagamatsu. Modular pieces linked together by strong bonds leading to an intricate and ever-expanding whole? We're here for it (the book and the technology strategy). Read the full Fuse: Technology for May here.
What is data democratization, and how do we manage it responsibly?
Traditionally, IT experts created data assets for business users upon request. Of course, people still went rogue, capturing, analyzing, and visualizing data in Excel "spreadmarts" outside of IT, but their potential for damage was limited. Today, as next-generation business intelligence (BI) tools become increasingly powerful and self-service enabled, and as global privacy laws and regulatory requirements increase, businesses without strong data management and governance programs face much greater risk. What is data democratization, and how can your business ensure that self-service data asset development doesn't trigger chaotic — and costly — consequences? Data management best practices can help you:
- Keep up with the pace of information needs outside of IT without spawning ungoverned shadow IT practices
- Manage existing shadow IT practices, particularly if your organization adds substantially more powerful BI tools to the mix
- Develop a more open data culture while also valuing privacy, security, and good governance
The solution lies in finding the right balance between increasing demands for data governance and the rapidly escalating need for data access.
What causes shadow IT — and why it can be dangerous
Growing demand for data-driven insights accelerates the need for knowledge workers to get information when they need it and in a format they can use. As these requests for data insights balloon, IT departments quickly get backlogged. To solve the problem, businesses sometimes turn to self-service data tools, particularly in the BI space. These tools reduce repetitive demands on IT time while enabling users to personalize how they access and view data in their own channels. Tools like Tableau and Alteryx provide rich data visualization, which further speeds time to insight.
Learn more about business intelligence (BI) options >>
While data preparation used to require highly technical skills and toolsets to extract, transform, and load information and generate reporting, data democratization puts significantly more power in the hands of average business users. Business users can now do work that the savviest Excel-wielding shadow IT veteran never dreamed of. Flattening XML, prepping custom geospatial polygons, blending, and cleansing data — even building predictive models, regressions, neural networks, and Naïve Bayes classifiers — can all be done without any traditional IT development knowledge. But data democratization has a dark side. Businesses can get into trouble when data democratization isn't closely paired with data governance and management. Without a carefully cultivated data culture that understands data governance and management, this massive upgrade in shadow IT firepower puts businesses at risk.
Data management best practices for risk mitigation
As data democratization becomes more of a reality in your organization, data management migrates from your IT and security teams to every business unit. Implementing data management across the business requires clear communication and leadership commitment.
Audit your information ecosystem
First, take stock of your current state in terms of data intake, preparation, access, and use. Take a fair and honest look at your data management practices and acknowledge where pockets of shadow IT exist. While Excel is obviously ubiquitous, understanding who has licenses for some of the newer tools, like Alteryx, may be a good place to start. When pockets are identified, ask some fundamental questions, like:
- What information is the business craving?
- Which tools or solutions have they tried?
- How are these tools being used?
- Is this the best tool or multi-tool solution for the job?
- Is there any overlap or duplication of assets across the business?
- What assets have they developed that could benefit a larger group or even the enterprise?
Shift your data management mindset
Then, resist the temptation to scold. The historical data management mindset toward those who created these one-off information stores needs to be turned on its head to focus on problems and solutions rather than reprimands. In light of more useful one-off data stores, you may find it hard to rationalize all of your current IT-generated assets. The cost to maintain them, particularly if they're not actually being used, makes them liabilities, not assets. Taking the time to define what the business needs, and then collaborating on the process, information requirements, tooling, and, potentially, the infrastructure and architecture solutions that would best meet and scale to fit those requirements, is a far healthier approach. Then, your company not only creates a self-service machine that can keep pace with demand, but also goes a long way toward building a healthy data culture.
How to build a strong data culture >>
Get clear on good governance
The term governance gets thrown around a lot, but does your organization have a clear idea of what you mean by it when it comes to your data? It's not enough for IT to have documented policies and controls. A mature governance program must be seated in the business. Once again, effective processes begin with business requirements. While IT may bear responsibility for implementing the actual controls to provide row-level security or perspectives, the business must provide the definitions, quality rules, lineage, and information that inform and support governed access. In this sense, IT becomes the steward responsible for ensuring those business-driven governance requirements are met. As your organization progresses toward data democratization, keep the following best practices in mind:
- Establish processes and workflows to bring democratized data and data assets under governance efficiently
- Co-create governance rules and standards with business units, and be sure they are communicated clearly to all data users
- Maintain governance requirements, quality rules, and access architectures that make data and data assets suitable and consumable by others within the organization
How data governance fits into strategic data management >>
Build a bridge between democracy and governance
Although bringing the creation and persistence of data assets into the controlled IT fold is critical for good governance, allowing the business to quickly and freely blend, experiment, and discover the most effective fit-for-purpose data sets for their information needs takes the burden off of IT to figure out what the business needs. How do mature data organizations bridge the gap between democratized data and good governance? Workflows. Workflows bring democratized asset development and IT-implemented controls together. A strong data workflow, including how requests are processed, prioritized, reviewed, and either approved or rejected, is the critical gatekeeper that prevents democratization from turning into chaos. Your workflow should address the following (a minimal sketch follows the list):
- Data submission: The process by which data assets are submitted for enterprise or departmental consideration as governed assets and persisted according to IT's self-service standards. Identifying the roles, process (inputs, outputs, gates), and relevant governance structure is fundamental to getting a meaningful workflow in place.
- Data request backlog: Not every one-off dataset is an asset – the validity of the data produced must be verified by examining the lineage of the data and any transformation logic (e.g., joins, calculations) used in its creation.
- Data scoring: The usefulness of the data must be scored or assessed in some objective way to determine if it should be published and to whom.
- Data access and security: The workflow process should also address access and security requirements.
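Here's a minimal sketch of what such a workflow might look like in code, with states, gates, and an audit trail. The roles, scores, and thresholds are hypothetical; the point is that every asset moves through explicit, reviewable steps.

```python
from dataclasses import dataclass, field

STATES = ("submitted", "under_review", "approved", "rejected")

@dataclass
class AssetRequest:
    name: str
    owner: str
    lineage_verified: bool = False
    usefulness_score: int = 0      # e.g., assigned during data scoring
    state: str = "submitted"
    history: list = field(default_factory=list)  # audit trail of transitions

    def transition(self, new_state: str, actor: str, note: str = "") -> None:
        assert new_state in STATES
        self.history.append((self.state, new_state, actor, note))
        self.state = new_state

def review(req: AssetRequest, actor: str, min_score: int = 7) -> None:
    # Gate: lineage must check out and the score must clear the bar
    req.transition("under_review", actor)
    if req.lineage_verified and req.usefulness_score >= min_score:
        req.transition("approved", actor, "meets governance standards")
    else:
        req.transition("rejected", actor, "needs lineage verification or rework")

req = AssetRequest("regional_sales_mart", owner="analyst@example.com",
                   lineage_verified=True, usefulness_score=8)
review(req, actor="governance_board")
print(req.state)      # approved
print(req.history)    # who moved it, at which gate, and why
```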
By streamlining the information demand management process and making it more efficient, your IT team can shift focus to providing higher-value data and information for the business, while potentially driving down cost by retiring the production of lower-value reports or marts.
Learn more about how to manage data as an asset >>
Manage change well
Shadow IT is called that for a reason. Getting those datasets, and those who create them, to willingly step into the light is a culture shift that requires effective change management and clear communication. Creating an environment that encourages self-service, democratized data asset development by the business is important, but, when unchecked, it can result in the proliferation of potentially redundant or conflicting data sources, none of which are under IT's purview. Responsible development and management of all data assets within the organization requires balance, oversight, and commitment to change. Democratizing data holds huge potential for your business when it's well managed and governed. Not sure where your company stands? Maybe a quick assessment could help. Our team of data experts can help you get clarity with a customized consultation, workshop, or audit designed to fit your needs.
Let us know what's on your mind >>
Learn more about data strategy and how to get started >>
How to put AI to work for your business
While artificial intelligence (AI) continues to linger in the popular imagination in the form of humanoid robots, in real life AI more often exists as a process enabler. Over the past several years, as falling costs democratized the technology, AI and related emerging technologies like machine learning (ML) and deep learning (DL) became more accessible to mid-market companies. Today, most businesses use AI in one capacity or another — streamlining work, minimizing risk, and gaining competitive insights. These innovations are more than buzzwords. They have powerful potential to revolutionize the way your business collects, processes, and acts on data to solve the real problems facing your business.
AI, ML, and DL in the business context
To find the right AI applications for your business, it helps to understand your options.
Artificial intelligence (AI)
- Definition: Machines programmed to be "smart"
- Common examples: Smartphones, chatbots, virtual assistants
- Example use case: Configuring a CMS to deliver personalized website experiences using available data points
- Limitations: The machine can only act on the specific rules provided
Machine learning (ML)
- Definition: Machines that learn from experience provided by data and algorithms
- Common examples: Spam filters, online purchasing recommendations
- Example use case: Discovering patterns in data, such as "customers who buy X also buy Y"; purchasing-cart analysis
- Limitations: Humans must input data parameters as a starting point
Deep learning (DL)
- Definition: ML applied to larger data sets using multi-layered artificial neural networks
- Common examples: Alexa, Google Translate, facial recognition, self-driving cars
- Example use case: Processing a large volume of unstructured data, such as images or voice recordings, to generate insights
- Limitations: Requires very powerful – and expensive – computational resources
How machine learning differs from AI
"ML is the science of getting computers to act without being explicitly programmed." – Stanford University
Machine learning takes a different approach to developing artificial intelligence. Instead of hand-coding a specific set of rules to accomplish a particular task, ML trains the machine using large amounts of data and algorithms that give it the ability to learn how to perform the task. Over the years, algorithmic approaches within ML have evolved, from decision tree learning and inductive logic programming to linear and logistic regressions, clustering, reinforcement learning, and Bayesian networks. Currently, machine learning uses three general models (a toy example of the first follows this list):
- Supervised learning: Humans supply factors until the machine can accurately apply the distinctions (for example, defining what counts as spam to a filter).
- Unsupervised learning: The system trains itself on provided data, which is used to surface unknown patterns, as in clustering and association. Clustering looks for patterns of demographics in data and how they predict one another, as in targeting groups of customers with products they will likely need. Association uncovers rules that describe data, as in online book or movie recommendations based on previous purchases and purchasing-cart predictions.
- Reinforcement learning: Using complex algorithms, the system learns through trial and error toward a defined "reward" of success. Cycling quickly through mistakes or near mistakes, the machine adjusts the weight of the previous results against the desired outcome.
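As a toy illustration of the supervised case (not a production spam filter), here's a sketch with scikit-learn: humans supply labeled examples until the model can apply the distinction on its own. The features are made-up counts chosen for clarity.

```python
from sklearn.linear_model import LogisticRegression

# Each row: [exclamation_marks, suspicious_links, known_sender (0/1)]
X_train = [[5, 3, 0], [0, 0, 1], [7, 2, 0], [1, 0, 1], [4, 4, 0], [0, 1, 1]]
y_train = [1, 0, 1, 0, 1, 0]  # labels supplied by humans: 1 = spam, 0 = not spam

model = LogisticRegression().fit(X_train, y_train)

# The trained model now applies the distinction to messages it hasn't seen
print(model.predict([[6, 3, 0], [0, 0, 1]]))  # expected: [1 0]
```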
How deep learning works
As another method of statistical learning that extracts features or attributes from raw data sets, deep learning builds on ML frameworks. While ML requires humans to provide desired features manually, DL uses even more complex algorithms and achieves more sophisticated results without that human input. Deep learning algorithms automatically extract features for classification. This ability requires a huge amount of data to train the algorithms and ensure accurate results. To process this volume of data, DL requires specially designed, usually cloud-based computers with high-performance CPUs or GPUs. Using multi-layered artificial neural networks inspired by the biology of the human brain — specifically the organic interconnections between neurons — deep learning trains artificial neurons to identify patterns in information to produce the desired output. Unlike the human brain, artificial neural networks operate via discrete layers, connections, and directions of data propagation. Three common types of artificial neural networks and DL processing applications are:
- Convolutional neural networks (CNN) are deep artificial neural networks used to classify images, cluster them by similarity, and perform object recognition. These algorithms navigate self-driving cars and enable facial recognition, but are also used in leading-edge medical applications such as identifying tumor types.
- Generative adversarial networks (GAN) are composed of two neural networks: a generative network and a discriminative network. While GANs can be used negatively, as in the creation of "deep fake" photos and video, organizations can also use GANs to create privacy-safe data pools for ML.
- Natural language processing (NLP) is the ability to analyze, understand, and generate human language, whether text or speech. Alexa, Siri, Cortana, and Google Assistant all use NLP engines, and many businesses are exploring ways to incorporate voice into their proprietary applications and digital solutions.
Make smart decisions about AI
Fusion Alliance provides cloud infrastructure and emerging technology solutions that accelerate your digital transformation. Our teams help businesses across a wide variety of industries uncover the best use cases for AI and the right emerging technology solutions to meet your goals. We can help you source, clean, and integrate your data, build and train machine learning models, and iteratively test and improve your solution to maximize results. Not sure how this might work for your business? Check out these real-world examples:
Find out how machine learning helps a national pizza chain retain customers >>
Discover how AI transforms business processes >>
Explore the future of wearables and mobile ML technology >>
Learn how ML can help businesses predict sales pipelines >>
How to choose the best CMS
A content management system (CMS) is the tool companies use to manage and publish digital content. Depending on where you are in your digital transformation — and, more specifically, in your digital marketing journey — you might think of a CMS as the place you put the copy for your website or the platform you use to launch email campaigns. If you're like most people, you think of a CMS in narrow terms of what you've seen one do before. That view might have served your company well for years. It might work today. But, as you think about the ways your company might scale or shift over the next several years, in a digital world that changes more and more rapidly, it might be worth giving your CMS another look.
Why your CMS matters >>
Choosing a CMS: 8 things to bear in mind
Whether you're in the market for a new CMS to support an upcoming initiative, or you simply want to make sure your current solution is still serving your goals, we've identified eight key factors to think through as you identify the best CMS.
1. Content complexity
Because the primary job of your CMS is handling your content, be sure that the platform you're using or choosing can handle the complexity of your content landscape — both the current state and where you're headed in the near term. Part of the CMS selection process includes envisioning where you want to be in the future. Technology advances fast, and customer expectations are not far behind. As your target audiences incorporate technologies like wearables, voice assistants, and AI-enabled interfaces, your content may need to evolve to reach them. While your content may not be complex today, will that remain sustainable for your business? If your content primarily exists on a single website, you may be able to use a simpler CMS. But if you use content across multiple interrelated sites, reuse content across digital properties, or support video, audio, or other media, you might need a more robust system. Planning ahead for a flexible, scalable solution may make more sense in the long run.
Get help with your content marketing strategy >>
2. Storage needs
Another factor to consider when choosing a CMS is storage. Once again, the types of content you need to support and the diversity of platforms where you need to display it play a critical role in this decision. If your internal policies or regulatory requirements require the CMS to be your system of record for auditing purposes, your solution will need more capacity than if storing legacy content archives elsewhere is a possibility. Depending on use cases, volume, and user experience needs, some media-rich content such as videos could be stored on an external channel like YouTube or Vimeo. Be sure to get internal input before making that decision, however, as UX for embedded video may be compromised depending on how your digital properties are structured. For organizations that rely on printable content, such as brochures, one-pagers, and other PDF content, interfacing a document management system with the CMS might make more sense than storing a large content library directly in the CMS. Storage needs can also be influenced by your organization's cloud strategy. Where your business intends to host the CMS — on-prem, in the cloud, or with a SaaS provider — matters when it comes to your storage decisions.
Cut through cloud complexity >>
3. Workflow automation
Regardless of the size of your IT and marketing teams, smart workflow automation can be a force multiplier.
As you evaluate your current CMS and move toward choosing the right CMS for your business, think through how that tool may impact existing workflows. If your content creation, editing, review, and publication process has many steps, automating some parts of the workflow probably makes sense. Some CMS solutions can handle step-by-step hand-offs, which frees your team from the need to shepherd pieces through to publication — and eliminates the risk of a task being dropped or forgotten along the way.

4. Ease of use

When it comes to CMS selection, functionality requirements balance against ease of use. If you don’t have on-team resources who can handle in-the-moment development or ongoing maintenance, you might need to opt for more of a What You See Is What You Get (WYSIWYG) content editor so that your marketing team has the flexibility to create needed assets in a timely manner. To weigh your needs for self-service content creation, think about which team members might be entering, editing, and approving content, and what skillsets might be required to use and maintain your chosen CMS. How your team is structured, existing competencies, and the ability to upskill or cross-skill those team members can also be important factors in your decision.

5. Ease of integration

While there may be an edge case or two of a CMS operating as a sole solution within a business, chances are you need one that can function as part of an existing or evolving tech stack. As you choose the right CMS for your business, take the time to map out the marketing, sales, project management, and other business-critical technologies you’ll need it to interface with. Some examples of tools you’ll need to interface with a CMS might include:

- Customer relationship management (CRM)
- Email systems
- Social media marketing
- Marketing automation
- Project management
- Forms
- Analytics and data dashboards

Most of these tools will either send data to or receive data from your CMS, so once you’ve identified the tools, systems, and platforms that need to connect with your CMS, you’ll need to determine if APIs are available or if you’ll need custom API development. Different CMS solutions offer different options for built-in integrations, and some make it easier to customize integrations than others.

Find out more about API strategy >>
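To make that send-or-receive pattern concrete, here’s a minimal sketch of a CMS-to-CRM sync over REST APIs. The endpoints, field names, and token are hypothetical stand-ins — substitute the documented APIs of your actual CMS and CRM vendors.

```python
# Minimal sketch: moving form submissions from a headless CMS into a CRM.
# The URLs, fields, and token below are invented for illustration only.
import requests

CMS_API = "https://cms.example.com/api/form-submissions"   # hypothetical
CRM_API = "https://crm.example.com/api/leads"              # hypothetical
HEADERS = {"Authorization": "Bearer <token>"}

def sync_new_leads():
    # Pull submissions the CMS has collected but not yet synced
    resp = requests.get(CMS_API, headers=HEADERS, params={"status": "new"})
    resp.raise_for_status()
    for submission in resp.json().get("items", []):
        # Map the CMS fields onto the CRM's lead schema
        lead = {
            "email": submission["email"],
            "name": submission.get("name", ""),
            "source": "website-form",
        }
        requests.post(CRM_API, headers=HEADERS, json=lead).raise_for_status()

if __name__ == "__main__":
    sync_new_leads()
```

Whether glue code like this is something you write and maintain yourself or get as a built-in connector is exactly the kind of integration question worth settling before you commit to a platform.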
6. Amount of customization required

Customization concerns extend beyond API options to encompass everything from the user interface and workflows to the security and functionality of your CMS. When it comes to choosing the best CMS, customization is a Goldilocks evaluation. You don’t want too little or too much. If your CMS requires significant customization or workarounds to achieve basic customization, you may be introducing too much risk. It might be easier to find a CMS that more closely fits your needs. On the other hand, if your CMS restricts code access and doesn’t allow enough customization, you could find yourself locked into a solution that holds you back when you need to scale or change. Instead, look for a CMS that hits the “just right” balance between meeting your needs and allowing creativity and control over the customer experience.

7. Costs

When it comes to technology, the sticker price rarely equals the actual cost of the solution. To get a more accurate view of cost when you’re choosing a CMS, factor in:

- Ongoing subscriptions if you choose a SaaS CMS
- Licensing fees if you choose proprietary CMS software
- External vendor time if you choose an open-source CMS and don’t have a dedicated development and maintenance team
- Internal team efficiency and infrastructure if you choose to handle maintenance, security, and development in-house

8. Service

Related to overall cost throughout your CMS lifecycle, remember to think through your service requirements — both in terms of implementation or upgrade and the ongoing support you’ll need. Your CMS vendor may be a great partner for implementation or might suggest that you choose a third-party implementation team. As you think about implementing or upgrading a CMS, also consider:

- What kind of data migration support you need
- The level and duration of support you’ll require for the shift
- Whether you need help with cut-over planning
- The time and resources you’ll need for system testing
- Training needs
- On-site support

Choosing a CMS: next steps

As you consider how to choose the best CMS for your business, remember that well-defined needs make for better-fitting solutions, and don’t try to tackle the requirements, system audit, and vendor selection process on your own. Getting key stakeholders involved early on in the process can help you make better decisions and give you the perspective you need to choose the right CMS. At a minimum, that group should include leaders from IT, marketing, and sales, but you may also want to include customer service, data, and key business units as well. If you have the right people at the table, a half-day workshop could be enough to set your project up for success. Key points to cover in that sort of meeting include:

- Getting a big-picture view of how the CMS will fit into your existing technology ecosystem
- Aligning goals and requirements across the business
- Outlining a budget
- Designating project owners
- Agreeing on how to prioritize requirements in light of available resources

Next, your project team can evaluate potential solutions against the agreed-upon criteria. You might use your organization’s vendor selection matrix, or develop a new one to fit the project. Finally, plan for the transition. If you aren’t sure what that might entail, consulting with a team of experts can make sense — saving you significant time and expense. Ready to get started? We help organizations navigate the process of choosing the right CMS from start to finish, but we’re also happy to jump in with a quick consultation if you’re feeling stuck.

Set up a 30-minute consultation >>
Learn more about martech strategy >>
Find out how a CMS fits into your overall customer data strategy >>
How to develop data as an asset
Developing a culture of and commitment to viewing data as an asset within your organization not only ensures good governance — and compliance with evolving privacy regulations — it also gives your business the insights needed to thrive in the rapidly changing digital world.

Understand the lifecycle of data as an asset

To encourage good data management processes, it’s important to understand the lifecycle of a data asset originating outside of IT. In these cases, data from multiple sources is blended and prepped for consumption, which typically includes steps to validate, cleanse, and optimize data based on the consumption need — and because these processes happen outside of IT, be on the lookout for potential security or governance gaps. While individual circumstances vary, from a big-picture perspective the data asset development lifecycle generally follows these steps:

Intake: Data assets can only be created or derived from other datasets to which the end user already has access. While traditionally this was more focused on internal datasets, blending with external data, such as market, weather, or social, is now more common. Ask: How are new requests for information captured? Once captured, how are they reviewed and validated? How is the information grouped or consolidated? How is the information prioritized?

Design: Once the initial grouping takes place, seeing data as an asset requires thoughtful design that fits in with the structure of other data sets across the organization. Ask: How will new datasets be rationalized against existing sets? How will common dimensions be conformed? How does the consumption architecture affect the homogeneity of data sets being created?

Curation: Depending on the source, data might be more or less reliable, but even lower-confidence information can be extremely valuable in aggregate, as we’ve seen historically with third-party cookies. The more varied the sources contributing to a data asset, the greater the need for curation, cleansing, and scoring. Ask: How will the data be cleansed and groomed based on the consumer’s requirements? Will different “quality” or certification levels of the data be needed?

Output: Organizations that view data as an asset prioritize sharing across business units and between tools. Consider implementing standards for data asset creation that take connectivity and interoperability into account. Ask: How will data be delivered? Will it include a semantic layer that can be consumed by visualization tools? Will the data asset feed into a more modern data marketplace where customers (end users) can shop for the data they need?

Understanding: As a shared resource, data assets require standardized tagging to ensure maximum utility. Ask: How will metadata (technical and business) be managed and made available for consumers of these sets? How is the business glossary populated and managed?

Access: To maintain legal and regulatory compliance and avoid costly mistakes, good governance requires access management. Ask: Who will have access to various delivered assets? Will control require row- or column-level security, and if so, what’s the most efficient and secure way to implement those controls?

Explore tools that streamline data asset preparation

In many organizations, the data asset lifecycle is no longer a linear journey, where all data proceeds from collection to analysis in an orderly progression of steps.
With the advent of the data lake, the overall reference architecture for most companies now includes a “marshaling” or staging sector that allows companies to land vast amounts of data — structured, unstructured, semi-structured, what some have labeled collectively as “multi-structured” or “n-structured” — in a single region for retrieval at a later time. Data may later be consumed in its raw form, slightly curated to apply additional structure or transformation, or groomed into highly structured, validated, fit-for-purpose, more traditional structures.

Podium Data developed a useful metaphor when speaking of these three levels of data asset creation. “Bronze” refers to raw data ingested with no curation, cleansing, or transformations. “Silver” refers to data that has been groomed in some way to make it analytics-ready. “Gold” refers to data that has been highly curated, schematized, and transformed so that it’s suitable for loading into a more traditional data mart or enterprise data warehouse (EDW) on top of a more traditional relational database management system.

To streamline the creation of assets at each of those levels, many organizations adopt self-service tools to ensure standard processes while democratizing asset creation. While the vendor landscape is wide in this area, the following three examples represent key functionality:

Podium, like Microsoft and others, adopted a “marketplace” paradigm to describe developing data assets for consumption in a common portal where consumers can “shop” for the data they need. Podium provides its “Prepare” functionality to schematize and transform data residing in Hadoop for a marketplace type of consumption.

AtScale is another Hadoop-based platform for the preparation of data. It enables the design of semantic models, meaningful to the business, for consumption by tools like Tableau. Unlike traditional OLAP semantic modeling tools, a separate copy of the data is not persisted in an instantiated cube. Rather, AtScale embraces OLAP more as a conceptual metaphor. For example, when Tableau interacts with a model created in AtScale on top of Hadoop, the behind-the-scenes VizQL (Tableau’s proprietary query language) is translated in real time to SQL on Hadoop, making the storage of the data in a separate instance unnecessary.

Alteryx is also a powerful tool for extracting data from Hadoop, manipulating it, then pushing it back into Hadoop for consumption.
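As a rough illustration of those three levels, here’s a minimal sketch in pandas. The records, columns, and cleansing rules are invented, and a real pipeline would run on lake-scale tooling rather than in-memory DataFrames:

```python
# Minimal sketch of "bronze / silver / gold" data grooming with pandas.
# Every value below is invented for illustration.
import pandas as pd

# Bronze: raw ingested records -- duplicates, nulls, string-typed fields
bronze = pd.DataFrame({
    "customer_id": ["A1", "A1", "B2", None],
    "order_date": ["2021-03-01", "2021-03-01", "2021-03-05", "2021-03-07"],
    "amount": ["100.00", "100.00", "250.5", "75"],
})

# Silver: analytics-ready -- valid rows only, deduplicated, properly typed
silver = (
    bronze.dropna(subset=["customer_id"])
          .drop_duplicates()
          .assign(
              order_date=lambda d: pd.to_datetime(d["order_date"]),
              amount=lambda d: d["amount"].astype(float),
          )
)

# Gold: schematized, fit-for-purpose aggregate ready for a mart or EDW
gold = (
    silver.groupby("customer_id", as_index=False)
          .agg(total_spend=("amount", "sum"), orders=("order_date", "count"))
)
print(gold)
```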
Keep security in mind

It’s worth noting that many self-service tools have a server component to their overall architecture that is used to implement governance controls. Both row-level security (RLS) and column-level security (sometimes referred to as perspectives) can be put in place, and that security can often be implemented in more than one way. Many of these tools can leverage existing group-level permissions and security that exist in your ecosystem today. Work with a consulting services partner or the vendors themselves to understand recommended best practices for configuring the tools you’ve selected in your environment.

Whether you’re evaluating self-service data tools or looking for ways to shift your organization’s culture toward seeing data as an asset, we can help. Fusion’s team of data, technology, and digital experts can help you architect and implement a comprehensive data strategy, or help you get unstuck with a short call, workshop, or the right resources to reframe the questions at hand.

Read about key considerations for data democratization >>
Learn more about data strategy and how to get started >>
Three technology strategies walk into a bar
Composable enterprises are more fun than they look. If you’ve got a monolithic legacy system on your hands, sticking with the status quo isn’t a fun choice. But going nuclear and building back from scratch probably isn’t realistic. Wouldn’t it be great to find a middle ground? Meet the composable enterprise. It’s an iterative path toward digital transformation, with applications repackaged into components that can be used to build new solutions across the business. Piece by modular piece, you rebuild your technology ecosystem – becoming more efficient, effective, and scalable as you go. As the glue that holds those components together, APIs are key to building a composable business. And developing secure API solutions that accommodate shifting capacity demands and amplify your technology takes a hefty dose of strategy and expertise. That’s what we love about it! If APIs are your jam, too, or if you’re wondering if a composable system makes sense for your business, let’s talk. We recommend reading Is your composable business enterprise project working or causing tech debt? Get smart: You’ve probably spent the day wondering how speculative/sci-fi/literary fiction relates to API strategy and microservices (or maybe that’s just us). But, we’d guess, the same type of mind that enjoys transforming legacy monoliths into composable enterprises would also really track with a book like How High We Go in the Dark by Sequoia Nagamatsu. Modular pieces linked together by strong bonds leading to an intricate and ever-expanding whole? We’re here for it (the book and the technology strategy).
A horse is a horse, and other martech myths
On being tech-agnostic

Martech is a crowded field, and a lot of the voices weighing in on your options have a horse in the race.* No one is out to skew the odds on purpose, but your organization is unique. Just because one solution is a front-runner doesn’t necessarily mean it’s a great fit for your business. So how do you sort the facts from the hype and decide where to place your bets? We rounded up a few martech myths as a starting point.

Myth: One CDP is as good as another.
Fact Check: Finding the right CDP (or CRM, or DMP, or any other solution you can think of) isn’t a simple box to check. And, once you make your decision, integrating and customizing your platform will also take time and attention.

Myth: Everyone needs a CDP.
Fact Check: Depending on your use cases, you might be able to do everything you need to do with your CRM.

Myth: You should pick a vendor and go all in.
Fact Check: It probably goes without saying, but when it comes to technology there are no one-size-fits-all solutions. One product might be a great fit for your needs, but that doesn’t mean that the vendor should then supply your entire tech stack.

Myth: Data and tech silos are just the way business works.
Fact Check: Regardless of size, scope, or industry, today’s businesses can’t afford to be siloed. When you’re evaluating a tech solution or rethinking your entire customer data strategy, prioritizing integration is always a safe bet.

On your mark. Ready to put your martech through its paces? Read on to find resources to help you optimize your stack and get your customer data strategy across the finish line.

*Full disclosure: one of our team members won $100 when Rich Strike won the Kentucky Derby, but for the most part we are platform- and livestock-agnostic.

Get Smart: We try not to be too on the nose with our book recommendations but couldn’t help ourselves this time. In Data Strategy, Bernard Marr collects a solid primer on the data landscape and how your organization can use it (legally and ethically) to advance your goals. Spoiler alert, though probably not surprising given the title: strategy turns out to be the foundational driver for effective data use. Whether you read the book, our Ultimate Guide to Customer Data Strategy, or just want to get a sense of potential next steps, we’d love to chat about customer data.
CDP vs CRM: 5 key questions to inform your decision
The difference between a customer data platform (CDP) and a customer relationship management (CRM) solution may be difficult to determine at first, because both options collect, store, and put customer data to use in support of business goals. While their functions may overlap, the CDP vs CRM debate becomes easier when you get clarity about the people, processes, and use cases for each option.

How to make the CRM vs CDP decision

1. What is a CDP?
2. What is a CRM?
3. What data is collected by a CRM vs CDP?
4. Who uses a CDP vs CRM and for what purpose?
5. What do we need: a CDP, CRM, or both?

1. What is a CDP?

A CDP unifies and standardizes large and detailed data sets from a wide variety of sources, resulting in robust customer profiles that enable real-time personalization. The CDP Institute defines a CDP as “packaged software that creates a persistent, unified customer database that is accessible to other systems.” Additionally, a CDP must have the following capabilities:

- Ingest data from any source
- Capture full detail of ingested data
- Store ingested data indefinitely (subject to privacy constraints)
- Create unified profiles of identified individuals
- Share data with any system that needs it

Through the process of identity resolution, the CDP can match, merge, and deduplicate data into a single customer view that can be segmented and analyzed — by human analysts or with the assistance of machine learning.
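Here’s a minimal sketch of what that match-merge-deduplicate step looks like. Matching on a normalized email alone is a deliberate simplification — real CDPs combine deterministic and probabilistic matching across many identifiers — and the records are invented:

```python
# Minimal sketch of identity resolution: match, merge, and deduplicate
# records from several sources into one profile per person.
from collections import defaultdict

records = [
    {"source": "web", "email": "Jo@Example.com", "last_page": "/pricing"},
    {"source": "crm", "email": "jo@example.com", "name": "Jo Smith"},
    {"source": "app", "email": "jo@example.com", "device": "iOS"},
    {"source": "crm", "email": "al@example.com", "name": "Al Jones"},
]

profiles = defaultdict(lambda: {"sources": set()})
for record in records:
    key = record["email"].strip().lower()      # deterministic match key
    profile = profiles[key]
    profile["sources"].add(record["source"])   # remember where data came from
    for field, value in record.items():
        if field not in ("email", "source"):
            profile.setdefault(field, value)   # merge: first value wins

for email, profile in sorted(profiles.items()):
    print(email, profile)
```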
2. What is a CRM?

A CRM and a CDP are both software solutions that handle customer data, but they differ in how, why, and who for. The difference came about organically, as organizations adopted different use cases for their customer data over time.

“CRM solutions were often proposed to tackle customer data management problems. The idea was that you could get ‘all of your data in one place’ to use for sales, marketing, and customer service. The promise was they’d break down silos in enterprises and design a view of the customer that wasn’t specific to sales or marketing or customer service. That sounds familiar to the promise of CDPs, doesn’t it?” — Lizzy Foo Kune, senior director analyst at Gartner

A CRM helps organizations manage customer relationships by consolidating what is known about customers from one-to-one touchpoints and transactional details into a single database, giving sales and service teams personal and actionable insights. According to the Microsoft Dynamics 365 website, “CRM systems help you manage and maintain customer relationships, track sales leads, marketing, and pipeline, and deliver actionable data.” Sound similar to a CDP? There’s a key difference: CRMs only apply to known customers and contacts. Moreover, they don’t cleanse, combine, standardize, or deduplicate the customer records, so they can’t give a business a “single customer view” across channels.

3. What data is collected in a CRM vs CDP?

That key difference reflects the two business silos that CRMs were developed to unite: marketing and sales. Marketing needs a high volume of customer data across touchpoints in a single, unified view to understand your customers and their behavior. CDPs collect digital data automatically using integrations and code snippets embedded in digital touchpoints, gathering customer data from websites, laptops, mobile devices, apps, and even CRMs into one place. The CDP then cleanses that data and produces consolidated customer profiles. Sales, on the other hand, needs customer data to help manage the customer relationship. CRMs store historical data about customer interactions in order to inform future ones. The data CRMs collect is usually entered manually, and its purpose is tightly focused on logging an interpersonal or transactional interaction — for example, notes from the latest sales call. The inputs are simple, though difficult to standardize or automate, and sales (and service) people typically enter them by hand to track the progress of the relationship.

4. Who uses a CDP vs CRM and for what purpose?

Your organization’s CDP vs CRM discussions may come down to who needs to use the system to accomplish critical business tasks. As we’ve said above, marketers need a unified view of the customer’s entire experience of the brand over time. A CDP’s ability to ingest, cleanse, manage, and analyze large volumes of data from many digital sources makes that task easier. But for sales and support teams, the key driver is managing customer relationships. In these customer-facing roles, contact management is critical, so a CRM’s ability to capture notes and manual inputs about one-to-one interactions facilitates that function.

5. What do we need: a CRM, CDP, or both?

While choosing between solutions isn’t easy, it’s not necessarily an either/or decision. You might find a both/and solution serves your business better. How do you make the call? If your business primarily needs to manage customer relationships in a more detailed, efficient, and personalized way, you might choose a CRM. In fact, over the last few years, CRMs have been innovating and evolving to function more and more like CDPs, so it might be prudent to wait and/or choose vendors carefully. Gartner predicts that 70% of independent CDP vendors will be acquired by larger technology vendors or will diversify by 2023. “CRM systems have seen the competitive threat that CDPs brought to the table,” Gartner’s Foo Kune said. “As CRM technologies recognize that they need to update their aging databases to meet the needs of modern business functions, including marketing, augmenting your CRM with a CDP may be unnecessary.”

If your business primarily needs to have a broad view of who your customers are and how they engage with your business, you may opt for a CDP. “Companies seeking a new strategy to form personalized customer experiences through data will need a CDP as it offers the resources to create a comprehensive view of the customer across each platform they interact with in real-time — whether it’s social media, apps or mobile,” says Heidi Bullock of Tealium, a CDP provider. “CRMs, on the other hand, help manage sales-focused customer data rather than collecting data across different channels.”

And if your business needs are broad, you can choose both a CDP and a CRM. While CDPs and CRMs offer two different marketing and sales data management solutions with differing strengths, you don’t necessarily have to choose between them. “CDPs and CRMs can actually operate simultaneously, as they work to fulfill different business goals,” Tealium’s Bullock notes. It’s possible to use a CRM as an input and output channel to a CDP, and, in turn, use a CDP to provide a 360° customer view data set within the CRM. Choosing both a CDP and a CRM can deliver both an amazing customer experience and tremendous business value: achieving high marks in customer satisfaction and providing integrated tracking and engagement. The CDP vs CRM choice depends on your roadmap.
Fusion works with clients to define a customer data strategy that fits each organization’s unique strategic objectives, operational needs, and timeline. From there, our team creates a tactical roadmap to define actionable steps toward those goals. Whether you’re just getting started or trying to get your digital transformation back on track, we can help. Ask a question >> Book a workshop >> Learn more about customer data strategy >>
Data is the holy grail
Are you using coconuts? Make your quest more efficient.

In the classic film Monty Python and the Holy Grail, viewers hear King Arthur and his trusty servant Patsy approaching with a trademark “clip-clop, clip-clop” sound. When the duo emerges from the primordial mist, you see (spoiler alert) that the source of all this noise is not, as might be supposed, a horse. Rather, Patsy is banging two coconut shells together as the king trots about on his own two legs. The duo is getting from point A to point B in their quest, but not in the most efficient or effective way possible.

So many companies follow that script. Equipped with buzzword mandates like process optimization and data-driven decision making, it’s all too easy to make small adjustments that sound like you’re headed in the right direction but aren’t necessarily getting you there any faster. How do you drop the coconuts and get on the horse (metaphorically speaking)? What does it look like to use data to drive optimization in real terms? We’ve got our eye on digital twins. Before you run away (how’s that for a deep-cut Monty Python reference?) from yet another data buzzword, it’s worth another look at this practical application of machine learning and data analytics.

Digital twins are most often used to optimize physical assets and processes like manufacturing, warehousing, and logistics. Using sensors to collect data on a product, machine, or physical process, the digital twin feeds real-time data to a machine learning algorithm to test variables and scenarios faster – ultimately leading to actionable process improvement insights. These days, we’re starting to see more businesses use digital twin frameworks to optimize and innovate non-physical business processes like accounting, HR, and marketing as well. A digital twin simulation can help you surface interdependencies and inefficiencies that might otherwise be blind spots, especially if they’re baked into your business culture as “the way we’ve always done it.” In the quest for digital transformation, don’t settle for coconuts. Instead, let’s talk about the ways your data can carry more of the weight for you.

Get smart: If all this talk of Monty Python and the Holy Grail puts you in the mood for an old-school movie night, good news: it’s available on Netflix. And if you’re looking for a more literary scratch for your Middle Ages (ish) itch, we’re reading Cathedral by Ben Hopkins. It’s a fascinating look at the complex processes involved in constructing architectural marvels in the days before edge computing. We may handle optimization differently now, but human nature stays the same.
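For the technically curious, here’s a toy sketch of the kind of what-if testing a digital twin enables — a simple queueing model of a fulfillment process standing in for the real thing, with every number invented for illustration:

```python
# Toy sketch of the digital twin idea: a simple model calibrated to
# observed data, used to test a scenario before touching the real process.
import random

random.seed(7)

def simulate(stations, per_station_rate, orders=1000):
    """Average hours from order arrival to completion with N parallel stations."""
    busy_until = [0.0] * stations
    clock = 0.0
    total = 0.0
    for _ in range(orders):
        clock += random.expovariate(12)              # ~12 orders arrive per hour
        station = min(range(stations), key=lambda s: busy_until[s])
        start = max(clock, busy_until[station])
        busy_until[station] = start + random.expovariate(per_station_rate)
        total += busy_until[station] - clock         # waiting + service time
    return total / orders

baseline = simulate(stations=3, per_station_rate=5)  # tuned to match observations
scenario = simulate(stations=4, per_station_rate=5)  # what if we add a station?
print(f"baseline: {baseline:.2f}h per order; with one more station: {scenario:.2f}h")
```

The point isn’t the model’s sophistication; it’s that scenarios get tested against a calibrated stand-in instead of the production process itself.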
Fuse Highlights: Food for thought
Every few weeks, we share insights with our Fuse subscribers along with news and trends we’re following across the web. Here’s a compilation of some of our key insights from last quarter. If you want content like this delivered directly to your inbox, we’ve got you covered. Subscribe to the Fuse here. Data literacy: Food for thought How do you get from buzzwords like “data literacy” and “data culture” to confidence that data is driving better decisions across the business? If Peter Drucker was right — and he’s Peter Drucker, so he probably was — culture eats strategy for breakfast. It’s not enough to build a business case for data. You need a business culture to support it. In our experience, success starts with aligning people, processes, and business goals with purpose-built data and technology solutions. When people understand what data makes possible and how it impacts their job — where to find it, and how to read and interpret the data they need — convincing them to use it to drive better decision making is a much easier lift. Easier said than done? You bet. We love a complicated algorithm or elegant data architecture, and we’re basically ninjas at selling business cases (if we do say so ourselves). But there’s a reason Fusion stakes a claim on being people-focused. Because we don’t just love data. We love when it works. Get smart: If you’re looking for an overview of data culture and a baseline for building data literacy across your organization, we recommend Be Data Literate by Jordan Morrow. Although written as a primer for individuals, the book’s framework could easily be used as a springboard for helping your whole company level up its data acumen. Read the full Fuse: Data for March. A one-brain approach to B2B marketing In AppleTV+’s bizarrely compelling drama Severance, employees’ brains are modified to separate their work memories from their off-work thoughts. Of course, what makes the show sci-fi is the fact that no one really has a “work self” and a “life self.” So, why does B2B marketing often seem to assume that consumers and business purchasers are different people? Compare your IG feed to the LinkedIn ads you’re served. One platform shows you talking Australian lizards. The other shows you text about processing speeds. When you need insurance, you remember where to go. When it’s time to make a CMS platform decision you…probably should have made a note. We want to believe that our B2B customers make purely rational decisions, but experience and data suggest otherwise. Whether it’s B2C or B2B, people predominantly buy from emotion, not stats and features. Creative marketers who are willing to push the envelope can capitalize on this idea to stand out in the sleepy B2B marketing landscape. It’s hard to argue with results. One of our clients, a pharma sales enablement company, saw 3x lead growth when they pivoted from standard B2B ads to a brighter, more engaging campaign direction. Your B2B targets don’t come to work as a separate persona. Creative marketing captures attention with a whole-brain approach. Ready to ditch the sinister work-life lobotomy assumptions? We’re always ready to talk about how to set your brand apart, whether it’s new creative or a streamlined martech stack. Let us know how we can help. Get smart: Wondering how to sell creative marketing internally? 
We’ve been reading The Human Element: Overcoming the Resistance that Awaits New Ideas and thinking through the authors’ framework for overcoming our natural resistance to change — especially as it applies to organizations. If you’re struggling through a shift, this book could be worth your time. Read the full Fuse: Marketing for March. Put your technology on a balanced diet Tech creep is kind of like strolling the cereal aisle with a four-year-old (or a 34-year-old, no judgment) who begs for the choco-sugar-neon-behavior-bombs instead of the sensible-fiber-nut-loops you had planned. When it comes to building your tech stack or stocking your pantry, “it looked cool” isn’t really a strategy. And yet, for many companies, an enterprise architecture hodge-podged out of whatever looked good at the time often gets the job done. Until it doesn’t. A move to the cloud, a new data privacy mandate, or even the increasing demand for speed and agility to stay competitive might expose the imbalance in your tech stack. How do you get back to a more wholesome view? Realigning your solutions with your organizational goals and objectives is a great start. Regardless of how long you’ve been using it, does every piece of your technology still fit your plan? You might need to let go of sunk costs and admit that a tool has gotten a little soggy for your current needs. You might need to put your appetite for shiny new solutions on a diet. At the risk of straining our balanced breakfast metaphor past the breaking point (too late?), we recommend putting a healthy strategy on the menu. As guidelines change and organizations shift to keep up, this is a great time to reassess your tools and processes. In its simplest form, a refreshed technology strategy includes a current state audit, an ideal state articulation, and a plan to bridge the gap. Whether your internal culture skews Team Sugar-Bombs or Team Fiber-Loops, we can help you take a strategic view and bring your technology stack back into balance. Get smart: We get that it’s a little bit ironic for a bunch of tech consultants to recommend a book like Cal Newport’s Digital Minimalism. But hear us out. Newport’s approach to consumer technology – that tech and platforms should have to earn their place in your life by proving that they help you meet your goals and values – has some merit for the business world as well. We’ve all seen what Newport terms “maximalism” at play in sprawling, bolted together legacy architectures. Maybe the time has come for a more minimalist, goal-driven tech stack. Whether you’re ready to start over or looking for ways to modernize what you have, we’re always happy to talk technology strategy. Read the full Fuse: Technology for April.
Put your technology on a balanced diet
Your tech stack is like breakfast cereal. Tech creep is kind of like strolling the cereal aisle with a four-year-old (or a 34-year-old, no judgment) who begs for the choco-sugar-neon-behavior-bombs instead of the sensible-fiber-nut-loops you had planned. When it comes to building your tech stack or stocking your pantry, “it looked cool” isn’t really a strategy. And yet, for many companies, an enterprise architecture hodge-podged out of whatever looked good at the time often gets the job done. Until it doesn’t. A move to the cloud, a new data privacy mandate, or even the increasing demand for speed and agility to stay competitive might expose the imbalance in your tech stack. How do you get back to a more wholesome view? Realigning your solutions with your organizational goals and objectives is a great start. Regardless of how long you’ve been using it, does every piece of your technology still fit your plan? You might need to let go of sunk costs and admit that a tool has gotten a little soggy for your current needs. You might need to put your appetite for shiny new solutions on a diet. At the risk of straining our balanced breakfast metaphor past the breaking point (too late?), we recommend putting a healthy strategy on the menu. As guidelines change and organizations shift to keep up, this is a great time to reassess your tools and processes. In its simplest form, a refreshed technology strategy includes a current state audit, an ideal state articulation, and a plan to bridge the gap. Whether your internal culture skews Team Sugar-Bombs or Team Fiber-Loops, we can help you take a strategic view and bring your technology stack back into balance. Get smart: We get that it’s a little bit ironic for a bunch of tech consultants to recommend a book like Cal Newport’s Digital Minimalism. But hear us out. Newport’s approach to consumer technology – that tech and platforms should have to earn their place in your life by proving that they help you meet your goals and values – has some merit for the business world as well. We’ve all seen what Newport terms “maximalism” at play in sprawling, bolted together legacy architectures. Maybe the time has come for a more minimalist, goal-driven tech stack. Whether you’re ready to start over or looking for ways to modernize what you have, we’re always happy to talk technology strategy.
Severance isn't a B2B documentary
AppleTV+ has your breakthrough. In AppleTV+’s bizarrely compelling drama Severance, employees’ brains are modified to separate their work memories from their off-work thoughts. Of course, what makes the show sci-fi is the fact that no one really has a “work self” and a “life self.” So, why does B2B marketing so often seem to assume that consumers and business purchasers are different people? Compare your IG feed to the LinkedIn ads you’re served. One platform shows you talking Australian lizards. The other shows you text about processing speeds. When you need insurance, you remember where to go. When it’s time to make a CMS platform decision you…probably should have made a note. We want to believe that our B2B customers make purely rational decisions, but experience and data suggest otherwise. Whether it’s B2C or B2B, people predominantly buy from emotion, not stats and features. Creative marketers who are willing to push the envelope can capitalize on this idea to stand out in the sleepy B2B marketing landscape. It’s hard to argue with results. One of our clients, a pharma sales enablement company, saw 6x lead growth when they pivoted from standard B2B ads to a brighter, more engaging campaign direction. Your B2B targets don’t come to work as a separate persona. Creative marketing captures attention with a whole brain approach. Ready to ditch the sinister work-life lobotomy assumptions? We’re always ready to talk about how to set your brand apart, whether it’s new creative or a streamlined martech stack. Let us know how we can help. Get smart: Wondering how to sell creative marketing internally? We’ve been reading The Human Element: Overcoming the Resistance that Awaits New Ideas and thinking through the authors’ framework for overcoming our natural resistance to change – especially as it applies to organizations. If you’re struggling through a shift, this book could be worth your reading time.
Data literacy on toast
Culture. It's what's for breakfast. When it comes to implementing emerging technologies and advanced analytics, it’s easy to get caught up in building the business case. But we can all think of businesses that devoted significant time and resources to leading-edge solutions and still failed to see results. If Peter Drucker was right – and he’s Peter Drucker, so he probably was – culture eats strategy for breakfast. This is not to say that your data initiatives are doomed to the frying pan (sorry, we couldn’t resist). Rather, to deliver value quickly, you can’t stop at data strategy, quality, and governance. It’s not enough to build a business case for data. You need a business culture to support it. Where do you start? How do you get from buzzwords like “data literacy” and “data culture” to confidence that data is driving better decisions across the business? In our experience, success starts with aligning people, processes, and business goals with purpose-built data and technology solutions. When people understand what data makes possible and how it impacts their job, where to find it, and how to read and interpret the data they need, convincing them to use it to drive better decision making is a much easier lift. Easier said than done? You bet. We love a complicated algorithm or elegant data architecture, and we’re basically ninjas at selling business cases (if we do say so ourselves). But there’s a reason Fusion stakes a claim on being people-focused. Because we don’t just love data. We love when it works. Get smart: If you’re looking for an overview of data culture and a baseline for building data literacy across your organization, we recommend Be Data Literate by Jordan Morrow. Although written as a primer for individuals, the book’s framework could easily be used as a springboard for helping your whole company level up its data acumen.
Catalyst SDM&A: How to make your data deliver
Organizations today are inundated with data from different sources. This data can help you make better business decisions, improve customer interactions and retention, and create more intentional strategic plans. But none of that is possible if you can’t trust or access your data. Some of the common pain points we see around data in organizations are:

- There is no single source of truth to use to make decisions
- Managers don’t have the right data available in real time to accelerate decision-making
- Time is wasted searching for and reconciling data
- The data exists, but is not keeping up with business needs
- Concerns about compliance
- Lack of trust in data quality

That’s why we created our proprietary Catalyst Strategic Data Management & Analytics (SDM&A) Framework — a comprehensive and flexible framework that enables you to examine your business across all domains of data and analytics maturity.

Fusion’s Catalyst SDM&A

Without the right framework, your information is useless to your organization. We work to ensure that each component of the framework is accounted for in your strategy and that you’re able to execute in a way that moves your business goals forward. Based on our experience, we’ve found that companies need a holistic, 360° view of the SDM&A landscape in order to decide how to proceed most efficiently while delivering maximum business value. Using our framework, you can:

- Understand your business’s current data maturity level across the framework’s seven critical domains
- Identify gaps or deficiencies between your desired state and your current state, and walk away with a roadmap that gets you where you want to go
- Align data and analytic investments with business strategy, goals, and objectives

Our Catalyst SDM&A Framework is completely customizable, so we are able to meet you wherever you are on your data journey and create a solution that meets your unique needs. Click here to learn more about our strategic data management services.
Future-proofing your digital transformation
We've got your Magic 4 Ball

If there’s one thing the past two years have shown us, it’s that future-proofing your plans isn’t as easy as it looks. We live in a time of rapid change. And so do your customers. We can’t advise you about taking that cruise, who will win the Super Bowl, or what color to paint your office*. But, as 2022 takes the floor, we’re keeping an eye on the trends that are most likely to impact your digital transformation. Here’s what we’re looking out for:

Technology: The trend toward industry-specific vertical clouds that deliver platforms and infrastructure as a service designed around regulatory and security requirements.

Data: Comprehensive customer data strategies that take integrity, governance, infrastructure, BI, and analytics into account.

Digital: More cross-functional ownership of digital transformations, and a resulting increase in solutions that integrate data, technology, and new ways of working.

Marketing: The ripple effect from third-party cookie deprecation inspires exciting innovation in first-party-data-driven tactics, proving that privacy and impact aren’t mutually exclusive.

We can’t predict the future, but we can stay ready for it.

*Oh, and we vote “later,” “Bengals,” and “blue” for those first questions. Thank us later.

Get smart: If you’re looking to deepen your understanding of data and its importance to your customers, your business, and your daily life, our technology practice recommends reading Making Numbers Count by Chip Heath. And any time you want to have a conversation about data impact, we’re up for it!
Expect the unexpected: How to create a disaster recovery plan
Tornado. Fire. A malicious cyberattack. What will your organization do if and when the unexpected happens? If you have completed planning for disaster recovery, when did you last revisit it? The COVID-19 pandemic, for many, was a test of disaster preparedness and a reminder that planning for the worst can help your company do its best.

Disaster recovery is the planning and documentation done in advance to help your organization survive and recover in case of a catastrophe. The disruption could be a natural disaster, accidental data loss, or an attempt to disrupt your network, among other threats. These interruptions can lead to revenue losses, damage to your brand and reputation, and potentially, the loss of customers. The longer your recovery time, the greater the impact on your organization. A comprehensive disaster recovery plan should allow you to expand your recovery capability and improve your resilience. Having a plan enables you to focus, prioritize assets, and determine the best approach to recover normal operations.

Why do you need a disaster recovery plan?

Every organization must be prepared to weather worst-case scenario events. You might already have a business continuity plan, but it’s worthwhile to note that this is not the same as a disaster recovery plan. Business continuity focuses on keeping your business operational during a disaster, while disaster recovery focuses on restoring data access and IT infrastructure after a disaster. Ensuring that you have an effective disaster recovery plan can reduce the magnitude of damage to your organization and keep the business running. With careful planning, you can document your organization’s key assets and create a plan to recover them.

Key steps to creating a disaster recovery plan

The following list can help you document and think through your organization’s approach to a disaster situation. Once you compile the information, store the document in a safe, accessible location off-site. If you have an existing disaster recovery plan, have you updated it recently? When disaster strikes, having an outdated plan can leave you scrambling. It’s important to view your disaster recovery plan as a constant work in progress.

1. Define your key assets

For many organizations, the most important digital assets include the enterprise resource planning (ERP) system and product and marketing plans. Prioritize your assets into three categories: business-critical, important, and non-critical. Categorizing your assets helps your organization best understand its needs during recovery from a disaster. The focus should be on recovering the most vital systems first.

2. Decide on a recovery window

Based on your asset prioritization, determine the duration of your Maximum Tolerable Downtime (MTD). Business unit owners should identify the Recovery Time Objective (RTO); this is usually one segment of the MTD. Business-critical assets are the most immediate focus during a recovery effort. Your recovery window should identify the MTD (in hours) that your business can survive without access to its most important information.
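Here’s a minimal sketch of those first two steps in code — categorizing assets, ordering recovery by priority, and sanity-checking that each recovery time objective fits inside its maximum tolerable downtime. The asset names and hour values are invented for illustration:

```python
# Minimal sketch: prioritize assets and check RTO against MTD.
# All names and numbers below are invented for illustration.
from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    tier: str       # "business-critical", "important", or "non-critical"
    mtd_hours: int  # maximum tolerable downtime
    rto_hours: int  # recovery time objective

assets = [
    Asset("ERP system", "business-critical", mtd_hours=8, rto_hours=4),
    Asset("Marketing site", "important", mtd_hours=48, rto_hours=24),
    Asset("Internal wiki", "non-critical", mtd_hours=168, rto_hours=72),
]

TIER_ORDER = {"business-critical": 0, "important": 1, "non-critical": 2}

# Recover the most vital systems first
for asset in sorted(assets, key=lambda a: (TIER_ORDER[a.tier], a.rto_hours)):
    status = "OK" if asset.rto_hours <= asset.mtd_hours else "AT RISK"
    print(f"{asset.name:15} {asset.tier:18} RTO {asset.rto_hours}h "
          f"/ MTD {asset.mtd_hours}h -> {status}")
```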
3. Define a recovery solution

There are many options available for backing up your data, from tape and disk backups to cloud storage. In case of a disaster, your organization would need to utilize remote storage so that your data is safe and accessible, even if the primary location is destroyed. In addition, all recovery plans and system build notes should be kept offsite.

4. Draft a disaster recovery plan

A disaster recovery plan is not a fixed document. It should be flexible enough to ensure asset retrieval regardless of the disaster. Your plan should contain enough detail that someone with technical understanding but no knowledge of your enterprise structure can follow it.

5. Test the plan

Testing your disaster recovery plan is a critical part of preparation and the only way to ensure viability. There are different stages of testing:

- Paper test: Individuals read and annotate recovery plans
- Walkthrough test: Groups walk through plans to identify issues and changes
- Simulation: Groups go through a simulated disaster to identify whether emergency response plans are adequate
- Parallel test: Recovery systems are set up and tested to determine whether they can perform actual business transactions to support key processes, while primary systems still carry the full production workload
- Cutover test: Recovery systems are set up to assume the full production workload, temporarily disconnecting primary systems

6. Schedule edits and follow up regularly

Operations and data change constantly. It isn’t enough to rely on a single test of your disaster recovery plan to ensure success when a disaster occurs. Document restoration and testing procedures and run the tests on a regular basis. How often you test depends on your evaluation of the rate of change in your systems, but test at least every 12 months.

7. Maintain your plan

Technology and business-critical applications change. Disaster recovery plans should be reviewed quarterly at a minimum to make improvements and changes that ensure critical business systems can be recovered.

Next steps for your disaster recovery efforts

Work with your organization’s process owners to gather the details needed to make a complete disaster recovery plan. You might find it worthwhile to partner with a consultant like Fusion Alliance to guide you through the process and incorporate this plan into your wider technology strategy, thinking through business processes and needs. For more information about disaster recovery planning and our services, drop us a line. We would love to talk about your organization’s needs.
Data executives empowered: The importance of the data community and elevating the CDO
Almost 20 years ago, Capital One recognized the need for one person to oversee their data security, quality, and privacy, and the role of the Chief Data Officer was born. Now reports show that 68% of organizations have a CDO (Harvard Business Review, 2020). And while the role has become more common and has significantly evolved, many data executives are still struggling to get a seat at the table or bring data to the forefront of their organization. In fact, in a recent survey, only 28% of respondents agreed that the role was successful and established. Company leaders agree that there needs to be a single point of accountability to manage the various dimensions of data inside and outside of the enterprise, including the quality and availability of that data. But now we are at a crossroads — what is the best way to align the work that the CDO does with the strategy of the business as a whole? The reality is that CDOs often struggle to find the internal and external support and resources needed to educate others to align with the organization’s goals. Implementing enterprise data governance, data architecture, data asset development, data science, and advanced analytics capabilities — such as machine learning and video analytics — at scale is not an easy task. To be successful, data executives need support, resources, and communities focused on the elevation of data. We are proud to continue to help these communities come to life for the benefit of our colleagues and clients, establishing local resources here in the Midwest with global scale, reach, and impact. Read on as Mark Johnson, our Executive Leader for Data Management and Analytics, provides insight on the current state of data and the CDO, and details multiple opportunities for data leaders at different levels to get more involved in the data community.

Q: How has the role of data changed/evolved for organizations?

The reality is that information is everything. The global pandemic proved that to many organizations. For some, it showed that their digital network was ready, and they were aptly prepared to take on COVID. Others were forced to recognize their own immaturity with data and analytics. On its own, managing data is not exciting — the information just sort of exists. To give data value, you have to put it to use. And so, I think we are going to see the Chief Data Officer and/or Chief Data Analytics Officer really come into their own in the coming years. It’s time for their seat at the table. The C-suite is now asking questions that can only be answered with data, and now they truly understand both the value and consequences of the data game.

Q: What do you think are the biggest challenges facing CDOs/data leaders today?

I think that the biggest challenge for data executives today is the acquisition of talent that is seasoned and experienced where you need them to be for your organization. Higher education hasn’t necessarily kept up with the data world, and it often takes additional training to reach the right levels. The reality is that right now the talent is manufactured in the real world. Data executives have to be connected and equipped to mentor, train, and keep the right people.

Q: You’ve mentioned that data leaders need to connect with each other. What value can people expect from these data communities?

I think there is tremendous value.
As we are seeing the power of data evolve in organizations, and the role of data leaders evolve as well, I think coming together to collaborate and share elevates the leader, the organization, and the view of data as a whole. These communities give people a safe space to talk about how they are doing, what they are doing, what their biggest challenges are, and what solutions are working for them. They have truly become both a learning laboratory and an accelerator for data.

Q: As a big proponent of connecting data leaders, you have been involved in creating different opportunities for people to get together. What groups/events would you recommend, and how can people get involved?

I personally have been involved with the MIT Chief Data Officer and Information Quality Symposium (MIT CDOIQ), which is a great place to start for connection. It has developed into additional opportunities for data leaders at all levels to get involved and create the kind of community we need to truly elevate the value of data. Organizations like CDO Magazine, the creation of CDO roundtables across the nation, and the International Society of Chief Data Officers (isCDO) all evolved from connecting data leaders and identifying common challenges.

MIT CDOIQ

The International MIT Chief Data Officer and Information Quality Symposium (MIT CDOIQ) is one of the key events for sharing and exchanging cutting-edge ideas and creating a space for discussion between data executives across industries. While resolving data issues at the Department of Defense, the symposium founder, Dr. Wang, recognized the need to bring data people together. Now in its 15th year, MIT CDOIQ is a premier event designed to advance knowledge, accelerate the adoption of the role of the Chief Data Officer, and change how data is leveraged in organizations across industries and geographies. Fusion has been a sponsor of this symposium for seven years now, and we are so excited to see how the event has grown. Designed for the CDO or top data executive in your organization, this is a space to really connect with other top industry leaders.

CDO Roundtables

Fusion has always been focused on building community and connecting people. And when one of our clients, a Fortune 500 retailer, mentioned wanting to talk with other data leaders from similar corporations, we realized that there was a big gap here — there was no space where data leaders could informally come together, without sales pitches and vendor influence, and simply talk. That’s how the CDO roundtables were born — a place that allows data leaders to get to know each other, collaborate, accelerate knowledge growth, and problem-solve. We started just two years ago in Cincinnati, but now we’ve expanded to multiple markets including Indianapolis, Columbus, Cleveland, Chicago, and Miami. These groups are designed for your CDO/CDAO and truly create an environment for unfiltered peer-to-peer discussion that helps solve data leadership challenges across industries. If you’re interested in joining one of these roundtables or starting one in your market, email me or message me on LinkedIn. I’m here and ready to get these roundtables started with executives in as many communities as I can. The more communities we have, the more data leaders and organizations we can serve.

International Society of Chief Data Officers (isCDO)

Launched out of the MIT CDOIQ symposium, the isCDO is a vendor-neutral organization designed to promote data leadership.
I am excited to be a founding member of this organization, along with our Vice President of Strategy, David Levine. Our ultimate goal is to create a space that serves as a peer-advisory resource and enables enterprises to truly realize the value of data-driven decision making. With multiple membership options available, isCDO is the perfect opportunity for data leaders looking to connect with their peers and gain a competitive advantage by focusing on high-quality data and analytics. CDO Magazine I am really proud to be a founder of the CDO magazine, as it really is a resource for all business leaders, not just the CDO. We designed the magazine to be a resource for C-suite leaders — to educate and inform on the value proposition, strategies, and best practices that optimize long-term business value from investments in enterprise data management and analytics capabilities. Check out the publication here. And if you’re interested in contributing content or being interviewed, let me know at email@example.com. Closing: The role of the CDO is integral to organizations, but it’s still evolving. Now more than ever, it is important that data leaders come together to collaborate and problem-solve. Fusion is excited to be a part of each of these initiatives, and we are committed to being an agent of change in the communities we serve and beyond. By connecting global thought leaders we believe that organizations will realize the value of data to power their digital transformation. If you’re interested in joining any of these data communities or just have questions, feel free to reach out to Mark via email or on LinkedIn.
Jumpstart your business processes: Using hyperautomation to achieve speed and scale
In a few short years, hyperautomation, or intelligent automation, has gone from a relatively unknown term to a word used across the technology spectrum. Gartner’s Strategic Technology Trends for 2020 named hyperautomation the #1 strategic technology trend for the year. Gartner also forecast that the hyperautomation software market will reach nearly $600 billion by 2022. What’s fueling the investment? Organizations are trying to remain competitive by decreasing costs and increasing productivity. A focus on hyperautomation can address business challenges and improve operational efficiency, not to mention elevating the customer experience. “Hyperautomation has shifted from an option to a condition of survival,” said Fabrizio Biscotti, research vice president at Gartner, in a recent press release. “Organizations will require more IT and business process automation as they are forced to accelerate digital transformation plans in a post-COVID-19, digital-first world.” The foundation of hyperautomation With Robotic Process Automation (RPA) at its core, hyperautomation incorporates advanced technologies — including artificial intelligence (AI), machine learning (ML), natural language processing, optical character recognition (OCR), process mining, and others — to not only automate tasks typically completed by humans but also to build intelligence into the processes, as well as the information derived from those processes. By building on RPA, hyperautomation elevates workflow automation to make decisions previously made by people. It augments the power and value of what RPA provides with a proven path to applying AI to improve business operations. Hyperautomation and digital transformation Because of the level of automation that can be achieved, hyperautomation is commonly referred to as the next major phase of digital transformation. And, it’s an intricate process. Organizations must implement automation simultaneously on multiple fronts to reach the end goal of hyperautomation. They often need to partner with digital innovation advisors and technology consultants to create a hyperautomation strategy from top to bottom and take all of the organization’s nuances into account. To achieve scalability, disparate automation technologies must work together. Careful planning, implementation, and improvement of processes are accomplished through intelligent business process management (BPM). BPM is a core component of hyperautomation and supports long-term sustainability and operational excellence. The combination of BPM solutions with low-code, RPA, AI, and ML has become a driving force for digital transformations, integrating essential data, connecting your workforce, and developing applications. It is up to technology leaders to create a clear strategy, set objectives, and prioritize actions across all business operations. Doing so ensures that the application of automation is efficient. Employees on the front lines are also in an excellent position to identify which processes would benefit most from automation. This can be supported by implementing a demand management solution. Then, it can be synchronized with the organization’s change management to ensure employees understand the changes and are prepared for more advanced processes, thus elevating the workforce. Organizations may be wary of the costs of change on such a large scale, but the process of integrating technologies does not always require creating a new infrastructure to replace manual operations. 
Many RPA, AI, and ML solutions can be integrated into the automation and technologies that already exist. The future of hyperautomation The next generation of hyperautomation includes support for more complex processes and long-running workflows. Software robots will be able to interact with business users across core business functions, directly impacting the customer experience. Hyperautomation represents the next step in intelligent automation and will transform how we work in the future. It allows businesses to protect their investments through a holistic approach to digital transformation. As hyperautomation becomes more prevalent, we will realize a seamless and equal blend of robotics, human employees, and existing systems, which will all work collaboratively in a way never seen before. No matter your industry, hyperautomation is worth consideration for its potential cost savings, intelligent processing, intelligence mining, employee efficiencies, and customer service improvements. Learn more about how hyperautomation technologies like ML and AI can benefit you.
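To ground the idea of "building intelligence into a process," here is a minimal sketch of one way ML can sit inside an RPA-style document workflow: a classifier routes OCR-extracted text to a processing queue and falls back to human review when confidence is low. This is an illustration only, not any vendor's implementation; the document snippets, queue names, and the 0.6 confidence threshold are all invented, and scikit-learn stands in for whatever platform an organization actually uses.

```python
# Hypothetical sketch: ML-based routing inside an automated document workflow.
# In practice, the text would come from an OCR step and the labels from
# historical routing decisions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Historical documents (e.g., OCR output) and the queue each was routed to.
docs = [
    "invoice total due net 30 remit payment",
    "purchase order quantity unit price ship to",
    "invoice balance outstanding payment terms",
    "purchase order vendor sku delivery date",
]
queues = ["accounts_payable", "procurement", "accounts_payable", "procurement"]

# TF-IDF features plus logistic regression: a simple, auditable classifier.
router = make_pipeline(TfidfVectorizer(), LogisticRegression())
router.fit(docs, queues)

# A new document arrives from the OCR step; the model suggests a queue, and
# low-confidence documents fall back to a human review queue.
new_doc = "invoice amount due remit to payment address"
confidence = router.predict_proba([new_doc]).max()
queue = router.predict([new_doc])[0] if confidence >= 0.6 else "human_review"
print(queue, round(confidence, 2))
```

The human-review fallback is the important design choice here: automation handles the routine cases, while ambiguous ones still reach a person.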
Gone phishing: The importance of security awareness training for employees
Are you spending increasing amounts of time reacting to incidents where an end-user clicked on something, downloaded an unknown file, or entered credentials for a document they thought a coworker sent to them? It’s not just you. A recent survey confirmed that cybersecurity threats are on the rise. 53% of IT professionals surveyed indicated an increase in phishing activity since the start of the COVID-19 pandemic. With remote work continuing for many employees, IT departments find themselves playing defense against these cyberthreats. Sophisticated phishing techniques can catch even the most well-meaning employees off guard. Regardless of how your network is monitored, secured, and maintained, the “human firewall” can be the weakest link in the chain. To combat this, practical security awareness training has become vital. The need for security awareness training Security awareness training is necessary to teach employees how to identify potential threats. All employees, regardless of job title and function, are susceptible to attacks. A 2020 MediaPRO and Osterman Research study found that only 17% of employees are very confident that they can identify a social engineering attack, while more than one-quarter of employees (28%) admitted a lack of confidence in identifying a phishing email. Because company information is readily available through mobile devices, tablets, and laptops, there is always a risk of accidental exposure. Offhand clicks, done without hovering over a link, can spell disaster. Even two-factor authentication isn’t safe from social engineering schemes to obtain passwords and logins. Importance of security-minded culture Establishing a culture of security-minded employees goes beyond learning modules and quizzes. Security is the responsibility of all employees who have access to corporate systems. Awareness and training are ongoing activities, not a checkbox to complete once a year. By recognizing good behavior (e.g., thanking employees for forwarding suspicious emails along to the Help Desk), you can continuously instill the importance of each employee’s part to protect the company. You should use all incidents as teachable moments. But there are some other, less obvious benefits of a security-minded culture. A security-minded culture protects assets The average cost of a data breach in 2020 was a staggering $3.86M. Companies need to defend themselves by helping to increase the effectiveness of the “human firewall.” A security-minded culture empowers employees Security awareness training can reduce human error and empower your staff to know when an incident is happening. By preparing employees and enabling them to take action (e.g., feeling comfortable saying no when a caller posing as an executive requests sensitive passwords), you will improve employee reaction time and empower them to make decisions that help the organization. A security-minded culture prevents downtime Time is money, and downtime can create a significant loss of revenue. When an incident occurs, systems can be taken offline to properly investigate and recover. If your employees are more security-minded, there will be fewer incidents that cause downtime. A security-minded culture ensures compliance Some industries have enhanced scrutiny for employee security awareness. Conducting training ensures that you meet regulations and show that you are doing your due diligence as an employer and vendor. 
How to build a successful security awareness training program Creating, or even improving, your security awareness training program doesn’t have to be a massive undertaking. Because this subject is so top-of-mind, you might find that now is the perfect opportunity to engage your organization and use the momentum to your advantage. Here are some steps to get you started: Step 1: Gain stakeholder backing Unfortunately, security can be viewed as a low-value cost center. It is crucial to make sure your program has senior leadership support. Providing research data and current metrics on the number of phishing emails your organization receives can help you explain the need for investment. Step 2: Define security awareness education goals Not all organizations will have the same plans for the subject matter, employee participation, and education methods. Identify security training that meets the needs of your business. Step 3: Assess your audience Because security is an organizational issue, your audience probably consists of a wide variety of backgrounds and skillsets. Not everyone going through training is well-versed in cybersecurity, and not everyone learns the same way. Get to know your audience and ensure you are aiming to meet their needs. Step 4: Develop a program The education you provide could be administered in many ways, including learning management modules, presentations, and onsite Q&A sessions. Your company should also be performing regular phishing tests to simulate outside threats. Step 5: Perform ongoing training Awareness training is not something that should just be done annually, but rather something that takes place on a regular cadence that makes sense for your organization. Making security guidance and education routine ensures that your employees keep up to date. Emerging threats are continuously discovered. Your company culture and meeting cadence can best determine the frequency and methods that work for you. Step 6: Track results Metrics provide insight into the effectiveness of the training, as well as provide measurable reports to leadership. Successful training will lead to more reported incidents as employees become more aware. The percentage of employees who have completed training, number of phishing exercises, and total real phishing threats detected are significant numbers to measure. Gone phishing Security awareness training is an essential part of any IT strategy, and one that you can’t afford to put off. Remote work paired with an increase in phishing threats creates a dangerous liability for your organization. All employees, regardless of position, need training to prevent a security incident. Find out more about how Fusion Alliance works with clients to improve security awareness: We partnered with a large, Ohio-based utility company to reduce the risk created when employees use their personal devices at work.
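As a concrete illustration of Step 6, here is a small sketch that computes three of the metrics mentioned above from phishing-simulation results. The record fields are invented for the example; real programs would pull these from a learning management system or simulation platform.

```python
# Hypothetical sketch of Step 6 metrics: training completion, simulated
# phishing click rate, and report rate. Record fields are illustrative.
from dataclasses import dataclass

@dataclass
class Employee:
    completed_training: bool
    clicked_simulation: bool   # clicked a simulated phishing link
    reported_simulation: bool  # forwarded the simulation to the Help Desk

staff = [
    Employee(True, False, True),
    Employee(True, True, False),
    Employee(False, True, False),
    Employee(True, False, True),
]

total = len(staff)
completion_rate = sum(e.completed_training for e in staff) / total
click_rate = sum(e.clicked_simulation for e in staff) / total
report_rate = sum(e.reported_simulation for e in staff) / total

# Over time, you want completion and report rates trending up, click rate down.
print(f"Training completion: {completion_rate:.0%}")
print(f"Simulation click rate: {click_rate:.0%}")
print(f"Simulation report rate: {report_rate:.0%}")
```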
Wearables and mobile machine learning change the face of healthcare
Advances in mobile health technology have transformed the entire landscape of healthcare, including the ability of physician groups, employers, nursing home facilities, and pharmaceutical companies to capture data in the healthcare space. Gone are the days of paper tracking for your glucose levels and blood pressure. Instead, wearable devices like watches and trackers can seamlessly provide real-time data streams to applications and third parties. The data collected from wearables can be used for clinical research, patient monitoring, and wellness tracking, among other uses. Each data point collected can add complexity to your broader data set. Because of the amount and complexity of data, turning to machine learning (ML) can help organizations leverage their data to identify patterns and make data-driven decisions. By applying machine learning techniques to wearable device data, we can now surface patterns in big data and make predictions about behavior. Machine learning enables healthcare-related industries to leverage wearable device data and identify trends, improve recommendations, and define research outcomes. Popularity of wearable devices Wearables are popular, and their adoption continues to grow. Globally, the wearable technology market is expected to grow from $69 billion in 2020 to $81.5 billion in 2021, an 18.1% increase, according to the latest forecast from Gartner. What’s fueling the growth? Demand for smart devices in the healthcare sector is rising, as is demand for Internet of Things (IoT) devices. Many devices are not fitness-specific, featuring notifications for text messages, push notifications for mobile apps, and the ability to pay for items by scanning a QR code with Google Pay or Apple Pay. As such, they have broad appeal. “As a result of the pandemic, we have seen wearable devices become much more than just activity trackers for sports enthusiasts. These devices are now capable of providing accurate measurements of your health vitals in real time. Improved measurement accuracy coupled with the latest advancements in ML make it possible to detect abnormalities before they lead to a major health event.” – Alex Matsukevich, Fusion Alliance Director of Mobile Solutions Types of data collected by wearable devices There are a variety of brands and categories of wearable devices, from mass-market consumer versions to highly specialized types created for niche uses. Apple, Fitbit, Google, Samsung, Garmin, LG, Sony, and Microsoft dominate the market. Though the concept of “wearables” centers on wristwatches, exercise equipment, glasses, and textile sensors are also becoming more common. Wearable devices can measure: Sleeping patterns Heart rate Irregular heart rhythms Location/route during exercise Pace, stride, and distance while moving Blood oxygen levels Falls Limitations of wearable device accuracy Wearables do have limitations, and accuracy is a concern. Healthcare decisions made using erroneous data could have outcomes detrimental to a patient’s overall health. A study from the University of Michigan reviewed 158 publications examining nine different commercial device brands. In laboratory-based settings, Fitbit, Apple Watch, and Samsung appeared to measure steps accurately. Heart rate measurement was more variable, with Apple Watch and Garmin being the most accurate, and Fitbit tending toward underestimation. But for energy expenditure (calories burned), no brand was deemed accurate. 
This does not mean that the results are invalid, but that there is a significant difference between results from wearables and clinical results in a lab setting. Wearable devices are constantly upgraded and redesigned as technology improves. And data collected by wearables does not provide a clinical diagnosis. As such, this data is just part of the larger picture of health and can be used only in conjunction with other factors to evaluate your overall wellbeing. Overcoming the biggest challenge of wearable device data analysis Healthcare professionals are already using ML to analyze data for patients. Research published in the International Journal of Research and Analytical Reviews confirms that ML techniques are successful in predicting health conditions such as heart disease, diabetes, breast cancer, and thyroid cancer. The biggest hurdle to incorporating device data into broader data sets is the addition of new inputs, such as hours of sleep or total steps walked per day. Traditional data points such as total cholesterol or blood pressure readings are less frequent, so there is a smaller amount of data overall. The challenge is finding how best to incorporate device data into other data sets to create a more comprehensive picture of health. The future of wearable device data and machine learning We can glimpse into the future of wearable device data and machine learning with Microsoft’s recent patent filing. Their potential product aims to provide wellness recommendations based on biometric data, such as blood pressure and heart rate, pertaining to work events. To do this, Microsoft requests access to applications used by employees. Microsoft then tracks data points such as: Duration of time spent writing emails Number of times a user refreshes their inbox Time spent reading emails Number of corrections made when writing emails Recipient list for emails Number of meetings in a day Tone of language in emails By combining this information with biometric data (from a secondary device such as a Fitbit or Apple Watch) and machine learning, Microsoft could begin to understand what work events trigger a response. For example, suppose an employee received an email from their manager. Microsoft might observe that the employee spent a higher-than-average amount of time reading the email and that the employee’s heart rate was also elevated during this time. Based on these insights, Microsoft could propose recommendations for helping employees manage stress levels, highlighting events that trigger anxiety. [Patent filing sample image from Microsoft outlining tips and recommendations to improve employee wellness] With a broad user base using both Office and Teams already, Microsoft has a deep understanding of work-related events. As Facebook built their business making sense of our social lives, Microsoft has the potential to optimize our work lives. “Wearables combined with machine learning will become the new standard in personalized consumer electronics, rapidly increasing in popularity and scale every year. An integrated device of the future will be able to get a baseline of your health and will alert you to any abnormalities present. We already see this happening with the new Apple Watch, and it will be very soon that this technology becomes commonplace.” – Michael Vieck, Fusion Alliance Software Developer Wearable devices will transform healthcare experiences Data is the key to predicting, understanding, and improving health outcomes. 
IBM Research anticipates that the average person will generate more than 1 million gigabytes of health-related data in their lifetime, equivalent to 300 million books. The sheer volume of data means that machine learning will be vital in making sense of it. Paired together, wearable devices and machine learning have the potential to transform healthcare experiences. Today’s applications and uses are only the beginning. Read more: Top 3 reasons to invest in machine learning for mobile Machine learning and wearable devices of the future Wearing Your Intelligence: How to Apply Artificial Intelligence in Wearables and IoT
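For a sense of what "detecting abnormalities" in wearable data can look like in practice, here is a hedged sketch that applies an unsupervised model (scikit-learn's IsolationForest) to simulated resting heart-rate readings. The data is invented; a production pipeline would ingest real device feeds and route flagged readings to clinical review rather than acting on them directly.

```python
# Illustrative sketch: flagging abnormal resting heart-rate readings from a
# wearable with an unsupervised anomaly detector. Data is simulated.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# A month of nightly resting heart rates centered near 62 bpm, plus two
# anomalous nights of the kind that might precede a health event.
resting_hr = rng.normal(loc=62, scale=3, size=30)
resting_hr[[12, 27]] = [91, 104]

model = IsolationForest(contamination=0.1, random_state=0)
labels = model.fit_predict(resting_hr.reshape(-1, 1))  # -1 marks an outlier

for day, (bpm, label) in enumerate(zip(resting_hr, labels), start=1):
    if label == -1:
        print(f"Day {day}: {bpm:.0f} bpm flagged for review")
```

The same pattern extends naturally to multivariate inputs (sleep, blood oxygen, activity), which is where wearable data starts to complement traditional clinical measurements.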
Zero Trust: What is it, why you need it, and how to achieve it
More organizations have shifted to the cloud, completely transforming the way business is done. For many, the days of solely relying on big on-premise data centers are gone, now replaced with a combination of on-premise and cloud-based applications. As the way we store and access data changes, we are forced to come up with new ways to improve infrastructure and keep it secure. That’s where Zero Trust comes in. No matter where you are on your Zero Trust journey — maybe you’ve never heard of it, maybe you want to try it but don’t know where to start, or maybe you’re in the thick of it — we’re here to walk you through five steps that will help you understand Zero Trust and how it can elevate your data security. So what is Zero Trust? Zero Trust is a security concept centered on the belief that organizations should not automatically trust anything inside or outside their perimeters and instead must verify anything and everything trying to connect to their systems before granting access. This vendor-neutral design philosophy allows maximum flexibility in designing infrastructure architecture. Every access request is fully authenticated, authorized, and encrypted before granting access. Lateral movement is prevented through security policies and least privilege (minimum permissions to do your job). Rich intelligence and analytics are utilized to detect and respond to anomalies in real time. The Zero Trust Maturity Model Traditional This level is where most organizations are today. Companies at this stage have not started their Zero Trust journey, and generally have: On-premises identity with static rules and some single sign-on (SSO). Limited visibility available into device compliance, cloud environments, and logins. Advanced At this level, an organization has begun its Zero Trust journey and has started to make some progress. The areas of adoption at this stage are usually: Hybrid identity and finely-tuned policies that gate access to data, apps, and networks. Devices registered and compliant with IT security policies. Networks being segmented and cloud threat protection in place. Analytics that are starting to be used to assess user behavior and proactively identify threats. Optimal Although the Zero Trust journey is never complete, at this stage an organization has made great strides and improvements in security through the adoption of: Cloud identity with real-time analytics and dynamically-gated access to applications, workloads, networks, and data. Data access decisions governed by cloud security policy engines and secured sharing with encryption and tracking. Complete Zero Trust in the network – micro-cloud perimeters, micro-segmentation, and encryption are in place. Implemented automatic threat detection and response. Steps to achieve Zero Trust 1. Define your protect surface Define your protect surface based on the most crucial data, applications, assets, and services elements for your business. 2. Map the information within your surface There are many ways to map transaction flows, and some techniques for defining your protect surface also apply to mapping its transaction flows. 3. Architect a Zero Trust environment As you develop the architecture, keep in mind ease of operation and maintenance, and flexibility to accommodate protect surface and business changes. 4. Create Zero Trust policy Zero Trust policy is based on the Kipling Method. This method shows you how to decide whether to allow or block traffic and how to create a security policy that safeguards each protect surface. 
Who should access a resource? What application is used to access the resource? When do users access the resource? Where is the resource located? Why is the data accessed — what is the data’s value if lost (toxicity)? How should you allow access to the resource? 5. Monitor and maintain Security is a continuous process: logging and monitoring will reveal improvements to make to your policies as your business and infrastructure change. Follow the operational processes you developed when architecting the network to maintain and continually update prevention controls. Running the Zero Trust marathon Zero Trust is a marathon, not a sprint. Since it is not a vendor-specific model, you can adopt it using a number of different vendors. If you are ready to start your Zero Trust journey or want to talk about where you’re at, reach out to us today.
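One way to picture a Kipling Method rule is as structured data evaluated against every access request. The sketch below is hypothetical (the group, app, and resource names are invented, and real deployments use a policy engine rather than hand-rolled code), but it shows how the who/what/when/where/how questions translate into a default-deny check. The "why" question, the data's value if lost, typically informs classification and logging rather than a boolean test.

```python
# Hypothetical sketch: a Kipling Method rule expressed as data and evaluated
# for every access request. Any failed check denies by default.
from datetime import time

POLICY = {
    "who": {"payroll-analysts"},          # allowed group
    "what": {"payroll-web-app"},          # approved application
    "when": (time(7, 0), time(19, 0)),    # business hours only
    "where": {"us-east-payroll-db"},      # protected resource
    "how": {"mfa"},                       # required authentication strength
}

def allow(request: dict) -> bool:
    start, end = POLICY["when"]
    return (
        request["group"] in POLICY["who"]
        and request["app"] in POLICY["what"]
        and start <= request["at"] <= end
        and request["resource"] in POLICY["where"]
        and POLICY["how"] <= set(request["auth"])  # all required factors present
    )

request = {
    "group": "payroll-analysts",
    "app": "payroll-web-app",
    "at": time(9, 30),
    "resource": "us-east-payroll-db",
    "auth": {"password", "mfa"},
}
print(allow(request))  # True; change any field and access is denied
```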
Data and analytics in the modern world: How to maximize ROI
Return on investment (ROI) is top of mind for everyone. With so many competing priorities, how you spend your time and money, and what you get for it, matters more than ever. The focus used to be entirely on the level of your investment. But the paradigm is shifting because of the data capabilities that now exist. In this article, we’ll explore how the definition of ROI has changed because of modern technology and approaches, where your data ROI comes from, and how to accelerate it. Setting goals for your data analytics efforts Data without analytics is ultimately an investment without return. Most organizations sit on troves of data, but can’t do anything with it. But analytics is a progression. Each of the levels on the data analytics maturity model represents questions you can begin to answer. To go from nothing to cognitive takes a lot — your investment increases substantially. Descriptive: What happened? Diagnostic: Why did it happen? Predictive: What will happen? Prescriptive: How can it happen? Cognitive: What can be suggested? With each step up the model, you add more information and complexity. For example, the descriptive level of maturity can be answered with a look at history. As you progress, you will need more information and stronger data relationships to better understand the “why.” Your data quality and integrity are also important. When you get to the cognitive step, you’re expanding outside your universe of data, and the contextual element of what you’re doing gets broader. For this step, consider Microsoft’s Cortana or IBM’s Watson. But in the modern data world, there doesn’t have to be a huge upfront investment. Shifting your focus from a return on investment to a return on insights can drastically impact how you invest in your data and your results. Calculating the ROI of data and analytics projects Ultimately, ROI is realized from leveraging effective data management to enable access to: more and better data, maximized visualizations, advanced analytics, and actionable insights for outcomes. For data management, that means: Improved quality and completeness Confidence and trust Accountability through governance Improved stewardship Advancing culture change to help stakeholders understand the importance and value proposition By improving your data management, the insights from your data become better and more actionable, including: Access to more data and the inclusion of new sources Faster and easier access to data Greater integration of disparate data Easier standup and use of analytics technology In years past, insights from data analytics might have been limited to data scientists or experts in the field. But now, with analytics tools and technologies, data insights are useful to — and actionable for — people across the entire organization. The larger the investment in time and money, the more emphasis on ROI, how quickly it can be realized, and the amount of trailing value. Data leaders have to work with their organizations to understand what the best strategy is – whether that be a smaller investment with a slower return or a big investment that allows you to realize your ROI sooner. It is critical to evaluate your organization’s needs, expectations, and goals before making decisions on strategic data management. Understanding the classic data ecosystem In a classic data ecosystem, the setup might look something like this. 
A classic data ecosystem requires deep analysis to understand data sources and definitions, and a considerable amount of time and effort to reach the gold standard you need for your data to be utilized for BI and analytics. Investments are required on all layers. There is no real way to invest in one aspect of your data and analytics and still find value. There is also significant effort required to ensure that as you introduce new systems, you don’t break legacy systems and processes already in place. Quite often, work must be done upfront to ensure that changes (even upgrades) will not cause disruption. In addition, significant “time-to-market” factors need to be considered with classic data ecosystems. Often, the slow delivery of data and features forces businesses to make incremental changes without undertaking any kind of larger project. Doing so might be helpful at first, but can cause issues later. With a slow delivery of data, many organizations using a classic data ecosystem find that they are unable to keep up with the pace of business today. Classic data ecosystems are often built to meet reporting needs, not analytical needs, and the analytics piece is often a one-off project. The deployment and incorporation of analytical models into production in a classic setup requires a considerable amount of time and customization. Building a modern data ecosystem A modern data ecosystem takes a more layered approach. Now it is much easier to gather, ingest, and integrate the data, and bridge gaps between systems, along with including new concepts like data lakes that can include data layers at the bronze, silver, and gold levels. You don’t have to invest fully in all of the layers; you can invest where you need to. You still have BI & analytics capabilities, but you have more of an application integration framework that serves additional needs. And there is greater trust in the data. This setup allows for more flexibility and customization for all parts of the organization. Read more: How to build your data analytics capabilities The value of incremental change Your investment doesn’t have to be an all-or-nothing proposition. You can incrementally build out components and capabilities and can make data available for exploration without deep upfront analysis that often slows everything down. Additionally, you can control the degree of your investment in a significant way. Because flexible architecture doesn’t force you to push data through every layer before it becomes useful, you don’t have to make a significant investment or sweeping changes to see value. You can also leverage external tools in the interim. Service and subscription-based features allow for fast initiation, and exploratory efforts can be stood up and torn down easily and quickly. New technologies and design/development paradigms enable faster adoption overall. And now, more user groups are able to access data and analytics, create more use cases, and make business decisions on the insights. Ultimately, it is time to shift your thinking on ROI, leverage modern data technology and tools, and focus on the return on insights, intelligence, and innovation. For more information on data, analytics, and assigning ROI, Fusion’s Vice President of Data, Saj Patel, recently spoke at the CDO Summit. His presentation details how to accelerate ROI and gain buy-in across your organization. Watch the recording here and connect with us if you have any specific questions. Learn more about Strategic Data Management here.
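To illustrate the bronze/silver/gold layering mentioned above, here is a simplified sketch using pandas in place of an actual data lake. The column names and values are invented; the point is the progression from raw to cleaned to business-ready data, and the fact that each layer can be invested in (and consumed) independently.

```python
# Simplified sketch of bronze/silver/gold data layers, with pandas standing
# in for a real data lake. Columns and values are invented.
import pandas as pd

# Bronze: raw data landed as-is from a source system, duplicates and all.
bronze = pd.DataFrame({
    "order_id": [1, 1, 2, 3],
    "region": ["east", "east", "WEST", None],
    "amount": [120.0, 120.0, 80.5, 42.0],
})

# Silver: cleaned and conformed (deduplicated, standardized, gaps handled).
silver = (
    bronze.drop_duplicates(subset="order_id")
          .assign(region=lambda df: df["region"].str.lower().fillna("unknown"))
)

# Gold: business-level aggregates ready for BI and analytics consumers.
gold = silver.groupby("region", as_index=False)["amount"].sum()
print(gold)
```

Exploratory users can work against bronze or silver immediately, while only the data that earns its keep gets promoted to gold, which is one reason this setup avoids the all-layers-upfront investment of the classic ecosystem.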
Using data to drive digital transformation
While every organization’s journey to digital transformation looks different, one thing remains the same — the importance of data. Tackling your data systems and processes is vital to fully transform. However, the reality is that most organizations are overwhelmed with data about their customers. But these troves of information are completely useless unless companies know that the data they have is accurate and how to analyze it to make the right business decisions. In today’s world, organizations have been forced to pivot and have realized the value data can bring to drive insight and empower their decision-making. However, many organizations have also recognized their data immaturity. So how do you move forward? The role of data in digital transformation Data can be your organization’s biggest asset, but only if it is used correctly. And things have changed. A lot of organizations have completed the first steps in their digital transformation, but now they are stuck — they aren’t getting the results they expected. Why? They haven’t truly leveraged their data. According to Forrester, “Firms make fewer than 50% of their decisions on quantitative information as opposed to gut feelings, experiences, or opinions.” The same survey also showed that while 85% of those respondents wanted to improve their use of data insights, 91% found it challenging to do so. So, now that you’ve got the data, how can you make it more valuable? Data strategy is key to your digital transformation With so many systems and devices connected, the right information and reporting is critical. But first, you have to make sure you have the right technology in place. Utilizing big data Although you might feel inundated with the amount of data you have coming in, using big data analytics can bring significant value to your digital transformation. Through big data analytics, you can get to a granular level and create an unprecedented customer experience. With information about what customers buy, when they buy it, how often they buy it, etc., you can meet their future needs. It enables both digitization and automation to improve efficiency and business processes. Optimizing your legacy systems Legacy systems are critical to your everyday business, but can be slow to change. Why fix what’s not necessarily broken? But just because systems are functioning correctly doesn’t mean they’re functioning at the level you need them to — a level that is conducive to achieving your data and digital transformation goals. This doesn’t have to mean an entire overhaul. You’ve likely invested a lot into your legacy systems. One key to a good data strategy is understanding how to leverage your legacy systems to make them a part of (instead of a roadblock to) your digital transformation. With the enormous scale of data so closely tied to applications, coding and deployment can often make this stage of your digital transformation feel overwhelming. Sometimes DevOps tooling and processes are incompatible with these systems. Therefore, they are unable to benefit from Agile techniques, continuous integration, and delivery tooling. But it doesn’t have to feel impossible — you just need the right plan and the right technology. Focusing on your data quality Even with the right plan and technology, you have to have the right data. Bad data can have huge consequences for an organization and can lead to business decisions made on inaccurate analytics. 
Ultimately, good data needs to meet five criteria: accuracy, relevancy, completeness, timeliness, and consistency. With these criteria in place, you will be in the right position to use your data to achieve your digital transformation goals. Implementing a data strategy with digital transformation in mind So how do you implement your data strategy? You should start by tackling your data engineering and data analytics. The more you can trust your data, the more possibilities you have. By solving your data quality problem, you can achieve trust in your data analytics. And then, the more data you have on your customers, the more effective you can make your customer experience. But, this all requires a comprehensive data strategy that allows your quality data to be compiled and analyzed so you can use it to create actionable insights. The biggest tools to help here — AI and machine learning. The benefits of a data-driven digital transformation The benefits of investing in your data are clear, including increased speed to market, faster incremental returns, extended capabilities, and easier access and integration of data. Discover more about the different ways you can invest in your data and improve and accelerate ROI for your organization. Ultimately, your goal is to elevate how you deliver value to your customers. Digital transformation is the key to understanding your customers better and providing a personalized customer experience for them. Leveraging your data can make all the difference between you and your competitors. And we’re here to help. Learn more about how some of our clients have benefited from investing in their data and digital transformation.
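Three of the five criteria lend themselves to straightforward programmatic checks. The sketch below, with invented fields and thresholds, scores completeness, timeliness, and consistency; accuracy and relevancy usually require domain rules or reference data rather than a generic test.

```python
# Illustrative data-quality checks for completeness, timeliness, and
# consistency. Fields, dates, and thresholds are invented for the example.
import pandas as pd

customers = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "email": ["a@x.com", None, "c@x.com", "d@x.com"],
    "state": ["OH", "oh", "KY", "IN"],
    "updated_at": pd.to_datetime(
        ["2021-06-01", "2021-06-02", "2020-01-15", "2021-06-03"]
    ),
})

# Completeness: share of non-null values in a required field.
completeness = customers["email"].notna().mean()

# Timeliness: share of records refreshed within the last year.
cutoff = pd.Timestamp("2021-06-04") - pd.DateOffset(years=1)
timeliness = (customers["updated_at"] >= cutoff).mean()

# Consistency: values conform to one standard representation (uppercase codes).
consistency = customers["state"].str.isupper().mean()

print(f"completeness={completeness:.0%} "
      f"timeliness={timeliness:.0%} "
      f"consistency={consistency:.0%}")
```

Scores like these can feed a data-quality dashboard, turning "trust in the data" from a feeling into a measurable input to your digital transformation.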
Modern testing standards: A three-layered cake
What’s next for banks? Reimagining the customer relationship with machine learning
The pandemic made it clear that traditional banking is a thing of the past. Online banking had already been on the rise, but a 200% jump in new mobile banking registrations in April 2020 established that customers are able and willing to change. As more Americans bank virtually, banks are fighting to meet customer demand. And beyond the challenges set forth by the pandemic, “digital natives” like Rocket Mortgage, Venmo, Stripe, and Robinhood are all vying for business. These technology-forward organizations position themselves differently from traditional financial institutions and are attracting a younger user base for their services. But a traditional bank has advantages over these challengers: Familiarity and history: Your personal relationships with customers mean that your bank is often more aware of their history. And customer questions can be answered in person instead of being routed through a call center. Deep and rich data: Historical data can prove invaluable for ML efforts. Customer deposit amounts, payments, and balance information can be used to predict future behavior. Preference for personal banking: Customers, especially those with a high net worth, may have discomfort with digital channel dependency for wealth management. A new brand might be a risk, and they could feel uncomfortable not having a specific person to call if something goes awry. As we get back to our “new normal,” traditional banks can use the rich data and relationships they have with customers to their advantage. Forward-thinking leaders are reimagining what it looks like to do business, and they’re using machine learning to elevate the customer experience. Discover how you can use machine learning to create engaging and profitable relationships with your customers. Every bank can find value from machine learning Machine learning might sound like a type of data analysis useful to only the largest of organizations, but its concepts can scale to meet the needs of small and mid-sized banks too. When we use the term machine learning (ML), we are referring to machines and systems that can learn from “experience” supplied by data and algorithms. In banking specifically, ML algorithms can be used to identify patterns in data beyond what humans are capable of observing, and these learnings can be applied to new data sets. It is now possible to improve the customer experience using ML. By parsing customer transaction data, ML can identify clues and patterns ahead of time, even before the customer considers taking action. For example, the process of buying a home and obtaining a mortgage might begin with small savings accumulation or an increase in deposit amounts from wages. ML models can assess banking-specific data like credit patterns, risk tolerance, and price sensitivity, and can be coupled with demographic data like age, median income, and distance to branch. The goal of using ML data in this use case is to target prospective customers with offers most relevant to their situation and stay ahead of customer demands. Knowing where to begin — and where to focus efforts Machine learning has such a wide variety of applications that it can be difficult to know where to start. Identifying a use case for customer-focused ML expenditures is a good first step. In general, we have found that you can benefit from starting with a use case with low or medium relative complexity. Examples focused on improving the customer experience include: Predicting service line interest (HELOC, mortgage refinance, etc.) 
Streamlining loan approval processes Increasing lines of credit Improving fraud alert notifications With so many use cases to choose from, it can be easy to get lost in the planning for each example. Instead, try focusing on one area at a time. Using your strengths, combined with ML concepts, you can deliver an optimal customer experience that digital challengers just can’t match. Need help getting started? Check out our Machine Learning Jumpstart program. Cross-selling across the relationship with machine learning You can leverage machine learning to determine not only which customers would be a good fit for a mortgage loan, but also the other products that customers might need. Continuing the mortgage example, a Home Equity Line of Credit (HELOC) might be a good match for a new homeowner. In any case, the message and product can be tailored to meet the customer’s specific needs. Another part of cross-selling is to personalize the offer based on the customer’s history and propensity to buy. Perhaps a better interest rate would be meaningful for one type of customer, while a waived application processing fee would entice another. For individuals who identified as being interested in high-revenue products, the marketing effort can be even more personalized, like a phone call or an in-person event invitation. Applying machine learning in real life The following illustration is an example of how an internal dashboard might appear to a banker or service representative. For any specific product, each person has a percentage likelihood that they will take action. Individual model scores are shown, along with next steps, such as outreach about an investment account, or mailing a promotion about mortgage rate refinancing. In this example, marketing inputs, like website data, are combined with transaction and deposit information. When a banker or service representative encounters a customer, either in person or on the phone, they can suggest specific next steps, or ask if the customer has questions. Having a dashboard with this information empowers banking employees to guide the conversation with data in real time. Related Case Study: Machine learning predicts outcomes at Primary Financial How does a financial services firm improve sales targeting to predict its clients’ desires to invest? Machine learning was the answer for PFC. Learn more. FAQs about machine learning and banking Does the machine learning process work fast enough to enable real-time benefits? For all but the most complex scenarios, yes! Normally, ML is fast enough to be integrated into real-time transactions. Does machine learning get in the way of compliance requirements? In general, no. By using existing data that you obtained or using your data in coordination with third-party data, you are not running afoul of privacy and compliance concerns. How do we ensure the machine learning use case we pick is right, given that there are so many to choose from? We recommend focusing initially on those low-cost, high-ROI use cases with a low-medium relative complexity. Given additional experience, context, pipelines, and an understanding of how advanced analytics programs operate, more complex initiatives can be undertaken. Data reliability can be a concern. Using low-quality data is not advised, but it is possible to start projects with small data sets. Engaging with a third party to evaluate your situation is advised in situations like this. 
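To make the propensity scores behind a dashboard like this more tangible, here is a hedged sketch of one common approach: a logistic regression that estimates each customer's likelihood of wanting a HELOC. Everything here is synthetic, including the features, the relationship between them, and the data itself; a real model would be trained on the bank's historical outcomes.

```python
# Hedged sketch: a propensity model for HELOC interest. Features, data, and
# coefficients are entirely synthetic, for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 500

# Synthetic banking features: monthly deposit growth, tenure in years, and
# a flag for a recent mortgage, loosely driving simulated HELOC interest.
X = np.column_stack([
    rng.normal(0.02, 0.05, n),   # monthly deposit growth
    rng.integers(1, 30, n),      # years as a customer
    rng.integers(0, 2, n),       # has a recent mortgage
])
logit = -3 + 20 * X[:, 0] + 0.05 * X[:, 1] + 1.5 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))  # simulated "took the offer"

model = LogisticRegression().fit(X, y)

# Score one customer; this percentage could drive the "next step" suggestion
# a banker or service representative sees in the dashboard.
customer = np.array([[0.08, 12, 1]])
print(f"HELOC propensity: {model.predict_proba(customer)[0, 1]:.0%}")
```

The same scoring pattern generalizes to the other use cases listed above; only the label (loan approval, credit-line increase, fraud alert) and the feature set change.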
Reimagining customer insights & relationships Banks that employ machine learning will have a portfolio of more customers than ever who are positioned for a variety of banking products delivered in a digital, personalized, and meaningful way. Now is the time to act and implement machine learning to meet customers where they are, using the contact methods that they desire and delivering the products and services to best meet their needs. Need help building your use case and plan? Access our Machine Learning Use Case Guide for Banks. Want to dig deeper? Check out our webinar on this topic.
Exploring cloud modernization challenges and the path forward
When considering cloud migration and modernization, many organizations are focused on moving workloads, reducing costs, and shifting from traditional on-premise data centers. But this is only a small piece of the true modernization opportunity. Organizations must embrace new software delivery paradigms across practices, architectures, and deployment methods to facilitate change in all parts of the organization. By taking advantage of cloud modernization, it is possible to maximize impact, improve the customer experience, and improve the bottom line. However, some of the biggest problems most modernization projects face are also the most basic: What do we modernize? How do we prioritize and rank projects? When do we begin? In this article, we’ll explore these questions and guide you along in your journey to modernization. Jump to a section: Approaches to modernization Getting buy-in for modernization Prioritizing modernization efforts Choosing which applications to modernize The path to modernization Leveraging cloud and API strategies to enable true digital transformation Approaches to cloud modernization Each application has different needs, and you can move to the cloud in more than one way. Consider all aspects of your current situation as well as the end goal before getting started. Here are five common approaches for cloud modernization, as well as considerations, in order of cloud maturity level from least mature to most mature: 1. Replatform (lift and shift) This faster, less resource-intensive migration moves your apps to the cloud without any code modification. The key is to spend minimal time and money, yet realize the benefits. But not all applications lend themselves to this method, and you might find yourself running into internal and external dependencies. 2. Move, then modernize This strategy is also known as platform as a service (PaaS). In it, you run your apps on a cloud provider’s infrastructure. Developers can reuse languages, frameworks, and containers leveraging code that’s specific to the company. The downside might be missing capabilities, transitive risk, and being tethered to a particular framework. 3. Modernize, then move Support modernization requirements by modifying or extending existing code; then move to the cloud. Using this method, you can leverage the cloud characteristics of your provider’s infrastructure, but it also means incurring upfront development expense. 4. Replace with SaaS When requirements for a business function change quickly, this approach of buying the technology avoids the time and investment of mobilizing a development team. But you may face issues like inconsistent data semantics, difficult data access, and being locked into a vendor. 5. Rebuild or refactor By re-architecting legacy applications to be cloud-compatible, you can fully realize all the benefits. But this approach typically involves significant recoding. Not all applications warrant this investment, but those that do will find value long-term. Determining your functions Once you determine your overall approach, considering the delivery vehicle for your applications will help determine if you should be using virtual machines, containers, or serverless functions. Modern application design patterns are based on the principles of scalability, statelessness, and high availability — and can still be delivered by using a virtual machine. There’s no standard rule for which is best. 
Cloud-native development is an approach to building, running, and improving apps based on well-known techniques and technologies. It’s a way to build and run responsive, scalable apps in any environment, whether it be public, private, or a hybrid cloud: Serverless With serverless, developers don’t have to provision servers or manage scaling; those routine tasks are abstracted by the cloud provider. As an event triggers app code to run, the cloud provider dynamically allocates resources. The user stops paying when the code finishes executing. In addition to the cost and efficiency benefits, serverless frees developers from routine and menial tasks associated with app scaling and server provisioning. Container Containers help teams alleviate issues and iterate faster across multiple environments. By separating areas of responsibility, conflict is reduced between development and operations teams. Developers can focus on their apps and operations teams can focus on the infrastructure. And, because Linux containers are based on open-source technology, the latest and greatest advancements can be used as soon as they’re available. Virtual machine Everything inside a virtual machine is configured individually for the apps you run in it. The host operating system simply acts as a foundation for VMs to run. With VMs, you can run an entire operating stack in a virtual machine. Related article: 5 keys to planning a successful cloud implementation Getting buy-in for modernization One of the greatest challenges we see IT leaders face today is getting organizational (and budget) buy-in for modernization efforts. Too often, businesses focus on the costs of these initiatives without taking into account the benefits modernization can provide. When putting together your business case for modernization, consider framing up how moving to the cloud is part of your business strategy to: Fight competitive threats and enable you to develop technology that your business requires; you’ll be able to create and deploy new features faster Break your innovation crisis and be empowered to develop new strategies that don’t rely on outdated tech stacks Replace your house of cards, moving away from fragile systems and the inability to test properly Attract technical leaders and not have one lone specialist who is the only one capable of repairing a specific system Restructure, merge, and acquire because older versions of middleware can quickly become a drain on budgets Modernization can be thought of as a way to make the most of your existing investments, no matter what your industry. But consider the bigger picture. How can you best meet your overall business goals? What applications would you start with? Related article: The cloud: From cost center to profit center Prioritizing cloud modernization efforts When you’re thinking of leveraging APIs to connect data, applications, or other business elements, legacy systems can be an excellent place to start, as there can be immediate time and cost savings in tackling them first. These highly integrated applications can be complex and challenging. Using APIs to decouple them from the infrastructure can start the process to shift monolithic apps into more granular, cloud-based functions. In considering systems that could be moved to the cloud, the idea of modernization is commingled with migration. Instead of moving from version to version of an application, your modernization plans now probably involve the cloud. This presents a number of unique decisions. 
You’ll have to make choices about the applications you prioritize to move into the cloud, and the pace of change is rapid. It is not just if you will modernize but when and how fast. For every system, no matter how advanced, technical debt eventually accumulates to the point where change is necessary. Choosing which applications to modernize As organizations move to the cloud, they are creating new ecosystems, placing the customer at the center of everything that happens. Front-office functions are no longer the only consideration to update. Taking a customer-centered approach allows faster response times, increased speed and efficiency, and advantages over competitors. The customer relationship can be redefined, enabling companies to better understand needs, wants, and purchasing patterns. Consider the applications that power the customer journey across functions and departments: Front office – Customer-facing website and mobile experiences Middle office – Analytics and customer insight apps Back office – Employee-shared services, including records and transactions Focusing on customer-related applications can prove results quickly, and improve buy-in for future cloud modernization projects. The path to cloud modernization When evaluating potential cloud modernization opportunities, there are five key steps: 1) Align; 2) Design; 3) Connect; 4) Implement; and 5) Enable. Alignment and strategy development When planning for modernization, consolidate and charter all elements of the modernization strategy. By partnering with key leaders across the business, you can be sure that all parties are involved and understand the project’s implications. When you are ready, capture and socialize the roadmap across programs and initiatives, getting everyone on board. Remember, cloud is only one part of the modernization puzzle. Design and adoption Once you’ve aligned stakeholders, it’s time to define your north star for the architecture and technology of your approach. Examine key cloud platforms and then capture and socialize reference architecture. You’ll also evaluate the type of microservices, containers, and serverless design. There are many ways to achieve your goals, but do keep in mind that some architectures can become incompatible. Connection and integration Beyond middleware, integration can connect and allow data to be shared across teams. On the other hand, having separation of data can enable federation and autonomy. Legacy applications can be modernized and, in some cases, replaced. Innovation can be spread across bespoke, SaaS, and legacy investments. APIs are the key to unlocking modern architecture. Implementation When it’s time for implementation, it’s important to contain the risk. Make changes across your portfolio one by one. Prioritize your timeline by business case and stick to the reference architecture you built previously. Remember that the path to cloud modernization isn’t just a straight line; it should be circular to reflect an ongoing process. Enabling modernization across the enterprise This is a cyclical process, and when you near the end of implementation, it might be time for another project! The modernization process is always in flux, and for almost all organizations, efforts are never fully complete. You’ll continue to learn as you iterate and be able to refine the work you’re doing across the organization. 
Leveraging cloud and API strategies to enable true digital transformation Digital transformation powers innovation and increases an organization’s ability to meet and exceed customer expectations. However, for companies with outdated IT systems and legacy applications standing in the way of progress, modernization is the way forward. Wherever you are on your organization’s journey to modernization, we’ve got resources to help you along the way. Want to talk? Just let us know.
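As a small illustration of the serverless model described earlier, here is a sketch of an event-triggered handler using an AWS Lambda-style signature. The event shape and the business logic are invented for the example; the point is that the handler is stateless and the provider, not the application, owns provisioning and scaling.

```python
# Minimal sketch of the serverless pattern: an event-triggered, stateless
# handler in the AWS Lambda signature style. Event shape is hypothetical.
import json

def handler(event, context):
    # The provider invokes this function only when an event arrives (an HTTP
    # request, a queue message, a file upload) and bills only for execution.
    body = json.loads(event.get("body", "{}"))
    order_id = body.get("order_id")
    if order_id is None:
        return {"statusCode": 400,
                "body": json.dumps({"error": "order_id required"})}
    # Stateless by design: anything durable belongs in a managed store,
    # never in the function itself.
    return {"statusCode": 200,
            "body": json.dumps({"order_id": order_id, "status": "queued"})}

# Local smoke test; in the cloud, the platform supplies event and context.
print(handler({"body": json.dumps({"order_id": 42})}, None))
```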
How Agile product owners should prioritize user stories
Many organizations struggle to adopt Agile as part of their digital transformation. And one of the most common struggles in adopting Agile is how to move toward incremental and iterative refinement. Iterating on a goal allows more information to be gained quickly, but it also means timelines are likely to overlap. While stakeholders tend to focus on their piece of the puzzle or the next goal, product owners have to balance complex timelines and goals for multiple stakeholders. How does an Agile product owner prioritize all the overlapping timelines and goals within the wider context of their organization? There are no solid rules, and everyone has to adapt to their own organization, but many successful product owners factor in the following key principles to make their decisions and plan their sprints: Prioritize based on a mixture of urgency, importance, and size Use tactical stories to achieve strategic goals Find ways to regularly invest in maintenance These principles can increase product ownership success and even lead to greater Agile adoption across your organization. As in most things, the trick is balance. Prioritize stories based on Urgency, Importance, and Size Urgency There is always pressure to prioritize urgency. You can’t turn back time, and if you miss an urgent issue then you are certain to hear about it. However, the biggest mistake product owners make is prioritizing urgency too high. There will always be urgent requests, and if urgency is your only factor, you will only end up fulfilling urgent requests. Importance Importance is not always reflected in urgency, but it must always be reflected in priority. Waiting for important work to also become urgent almost certainly spells disaster. The proactive stance of prioritizing important work against urgent work also tends to have political implications. We’ve all heard the adage “the squeaky wheel gets the grease.” The best customers are typically the ones who complain the least, so don’t punish them for that. This also applies to stakeholders. The more you incentivize good stakeholder behavior, the more you are in control of your backlog. Size Of course, urgency and importance are not the only two factors to balance. Part of the difficulty of building a sprint for a product owner is the need to also balance size. Ideally, you want constant progress throughout a sprint. It is common for a feature to get started and then get blocked. That is why you need to include user stories of all sizes. Only having large stories tends to mean stories will get handed off mid-sprint. Knowledge sharing provides value, but mid-sprint handoff only slows progress. If most stories in your backlog tend to be large, small stories will have increased priority in order to fill out a sprint even though they may be of lower importance and only medium urgency. Prioritizing through a mix of urgency, importance, and size doesn’t just create a balanced sprint; it also creates a balanced time orientation. Urgent stories make a team reactive. You can’t look forward when you’re focusing on past mistakes or present pressure. Directing your team’s time orientation beyond the present moment is a key strategy for creating a productive team. Use tactical stories to achieve strategic goals Of course, not every sprint can be a perfect balance of importance, urgency, and size. Like the strategy for directing a team’s time orientation, some choices might be preferred to create or enforce a team culture. 
While a product owner is the leader of their team, they also answer to stakeholders. And organizational objectives, usually data-driven ones, are created with these stakeholders as part of an organizational strategy. The decisions a product owner makes that have a direct impact on reaching these objectives are called tactics. But the product owner is still leading a team! There are many strategic decisions a product owner makes that may not have a direct benefit to organizational objectives but might lead to a more productive team. Good product owners don't just focus on the tactical decisions of meeting objectives; they refine strategies that benefit the individual team and drive toward objectives in an indirect manner. For instance, retrospectives tend to generate more strategic shifts than tactical shifts.
Common examples of tactics used by product owners:
- Sprints containing stories of varying size
- Proof of concept (POC) stories that discover information for further planning or design
- Prioritizing the most complex use cases first
- Prioritizing the most common use cases first
- Stories that require liaising with other teams, when collaboration is an organizational goal
- Maintenance that offers innovation affordances
- Maintenance that increases product stability
Common examples of product owner strategies:
- Cross-training by assigning stories based on areas of weakness
- Siloing specialties so individuals achieve high efficiency in strategic skills
- Requiring recorded demos as acceptance criteria for certain stakeholders
- Requiring regular alpha or beta releases so highly engaged stakeholders can provide feedback
- "Dead sprints" following releases, which allow feedback to build before addressing it
- Documentation stories that identify outdated or deprecated documentation and update it in order to lower future bug counts
- Support buckets: allocated time for bugfixes that can be released ad hoc instead of on a sprint schedule
To clarify, I am not recommending that product owners devote large amounts of resources to tasks that in no way benefit organizational objectives. It's simply a recognition that indirect benefits to objectives lead to success in certain types of organizations. Strategic stories are an intentional investment in greater future value (such as cross-training), although it's likely the story could provide more immediate value to the organization as well.
Find ways to regularly invest in maintenance
The last key principle for new product owners is learning how to please stakeholders while still investing in maintenance. This can be a challenge because many product owners will get a slap on the wrist if they spend too much time or money on maintenance, and not all maintenance plays are equal. However, I always recommend dedicating a portion of each sprint to maintenance. There are different types of maintenance, so be sure to find the maintenance tactics that fit your organization's needs. I'll highlight a few valuable maintenance story types that have proven effective time after time.
Tests
Tests are almost always a valuable play. Even for a product that truly requires a higher caliber of quality (where testing is likely a requirement of product delivery), it is common to discover things that lack regular testing. Tests can mean anything from writing automated unit tests for code to walking through the onboarding process to find points of friction. Unit tests make sure untouched features aren't broken by unrelated changes.
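As a minimal illustration of that idea, here is a pytest-style sketch. The discount function is a hypothetical stand-in for any small piece of product logic; the tests pin down its behavior so an unrelated change can't silently break it:

```python
# test_discounts.py -- a pytest-style unit test sketch.
# apply_discount is a hypothetical example, not a real product feature.
import pytest

def apply_discount(price: float, percent: float) -> float:
    """Return price reduced by percent, rounded to cents."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def test_ten_percent_off():
    # If an unrelated change breaks the math or rounding, this fails.
    assert apply_discount(50.00, 10) == 45.00

def test_invalid_percent_rejected():
    with pytest.raises(ValueError):
        apply_discount(50.00, 150)
```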
End-to-end tests take more time to perform but ensure that the most valuable processes always work. Manual tests, popular for proofs of concept, save the time required to automate tests but take longer to perform, matching a smaller initial investment to the likelihood that something will not be tested often. Manual tests can also be more valuable than unit tests for things that constantly change, since overly specific tests tend to surface more problems with the tests themselves than with the thing being tested. Find testing methods that work for your product, and don't forget that processes need to be tested regularly as well.
Support bucket
The concept of a support bucket is one of my favorite strategies. It communicates to a team that things worth building are worth maintaining. Refining products regularly shows team members their work is valuable. Where there is a strong bug-reporting system, it is likely the team struggles to resolve bugs as fast as they come in. A support bucket is a way to maintain a position of strength. When your team is on top of their bugs, a support bucket lets incoming bugs be addressed faster and keeps them from dominating a future sprint. If bugs don't come in as much as expected, it creates a vacuum that allows team members to define their own work. Many developers relish the opportunity to take two days to refactor code that works but is confusing. However, this is something that can only be done when there are already good tests to ensure the refactor doesn't cause more problems than it solves.
Pre-emptive customer engagement
One of the least-used but highest-value maintenance tasks is pre-emptive customer engagement. If bug reports aren't coming in, that can mean you've created a high-quality product! It can also mean you aren't hearing about difficulties encountered by customers, or that customers aren't adopting product updates for some reason. When user stories are created, it is usually easy to identify real customers to whom they will apply. Talk to those customers and ask them how they are using the new features.
How to prioritize user stories in Agile
All the talk about priority, strategy, tactics, and maintenance is great, but how do I balance ALL of that when assembling a sprint? Much of this depends on the leadership and the number of stakeholders you're dealing with. Organizational politics can send blue-sky planning into a tailspin. In the real world, you have to not only do good work, but make sure that work gets seen by the right people, and that they are impressed.
If your organization's leadership prioritizes innovation and "shiny things"
This is a sign that part of your work is about the ideas and the culture shift. This is a completely valid position for a leader. There are many political advantages to trying new things and building a culture of innovation, and it is a great way to gather information for refining organizational strategy. In cases like these, the following breakdown of average sprints works well:
- 50% strategies of short-term investment that can get thrown out or that create a culture of innovation
- 25% tactics supporting a long-term organizational strategy
- 25% maintenance
Applicable tactics/strategies: POCs, specialized efficiency, regular alpha/beta releases, prioritizing common use cases first, maintenance offering innovation affordances.
If your organization's leadership prioritizes long-term strategy, stability, or predictable growth
This is a sign that part of your job is about giving customers confidence in your products.
This is not an anti-innovation position; rather, it is a recognition that you are in a position of strength in your market. In cases like these, the following breakdown of average sprints works well:
- 50% tactics supporting a long-term organizational strategy
- 25% strategies to shape the team
- 25% maintenance
Applicable tactics/strategies: cross-training, prioritizing the most complex use cases first, maintenance improving product stability, improving documentation, support buckets, external liaisons.
Closing
If you're a product owner or your organization is beginning to adopt Agile, take a look at your organization to find out what your stakeholders truly care about. While every organization is unique, these factors are universal. Tailor your strategies to your team and your tactics to your objectives. Most of all, create balanced sprints that deliver constant progress toward your goals. Analyzing urgency, importance, and size can lead to solid prioritization that drives team success.
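To make that urgency-importance-size analysis concrete, here is a minimal sketch of a backlog-scoring heuristic. The weights and the 1-to-5 scales are illustrative assumptions, not a prescription; the point is simply that all three factors enter the ranking:

```python
# backlog_score.py -- an illustrative backlog-scoring heuristic.
# The weights and 1-5 scales are assumptions for demonstration only;
# tune them (or replace the formula) to fit your own team.
def story_score(urgency: int, importance: int, size: int) -> float:
    """Higher scores sort earlier. Importance weighs most, urgency next;
    large sizes are lightly penalized so sprints keep a mix of sizes."""
    return 0.5 * importance + 0.35 * urgency - 0.15 * size

backlog = [
    ("Fix login outage", 5, 4, 2),        # (name, urgency, importance, size)
    ("Refactor billing module", 1, 4, 5),
    ("Update onboarding copy", 2, 2, 1),
]
for name, u, i, s in sorted(backlog, key=lambda t: -story_score(t[1], t[2], t[3])):
    print(f"{story_score(u, i, s):.2f}  {name}")
```

A heuristic like this never replaces judgment, but writing the trade-off down forces a team to agree on how much a large size should discount an otherwise important story.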
The future of software testing: Trends for 2021 and beyond
In today's age of digital transformation, companies have had to change the way they test software. Teams relying on 100% manual test scripts find it very difficult to keep up with development. We often see developers either slowing down so the test team can keep up, or developers and others jumping in to help get the testing completed on time. In either case, there is a negative business impact and a reduction in test coverage.
Testing trends
As the reality of software testing has evolved, here are the trends we are tracking for 2021 and beyond:
Artificial intelligence and machine learning test tools
Modern web applications are highly dynamic. Developers are continually modifying and introducing new features to increase quality and meet the needs of stakeholders. But rapid change requires maintainable and reliable tests. Enter artificial intelligence (AI) and machine learning (ML). These technologies enable tests to detect changes in the application and adjust accordingly. Less time fixing tests leads to more time finding critical defects, which ultimately results in higher-quality software.
Shift left toward testing at the integration layer
In the waterfall model of the software development life cycle, much of the testing happens near the end (on the right end of the spectrum). Agile development, on the other hand, incorporates testing throughout the entire process. As a result, many testers are shifting left (i.e., performing tests earlier). In particular, testing at the integration layer with automated API tests will continue to pay dividends as release cycles shorten and microservice architectures are adopted.
Performance & load testing
While testing various aspects of performance has always been an essential part of software development, there is a renewed interest in load testing in particular. Due to an increasing number of open-source tools (e.g., Locust, Gatling) and a move toward cloud-hosted solutions such as LoadNinja and BlazeMeter, load testing is becoming more accessible to non-experts. Additionally, the ability to integrate performance tests into the CI/CD pipeline provides continuous feedback on how releases will perform under expected user loads. Look for load testing to grow this year and beyond.
"Low Code/No Code" tools
Building coded test automation frameworks, though worthwhile, requires a large investment of time, skill, and money. This can lead smaller companies to think of test automation as beyond their resources. Yet the recent advent of powerful new testing tools requiring minimal or no coding has shifted the balance. With easy-to-use open-source platforms like the Selenium IDE and TestProject (along with paid products like Mabl), automation of web and mobile apps, as well as APIs, is no longer out of reach. While these tools are no replacement for fundamental testing knowledge and skill, they can certainly lower some of the barriers to implementing test automation.
To Selenium & beyond
For years, Selenium WebDriver and its derivatives have dominated the market in web and mobile user interface testing. With the advent of Selenium 4, this powerful framework isn't going anywhere soon. However, a number of newer players are making themselves known in this space, including TestCafe, Microsoft Playwright, and Cypress. Each of these frameworks has its own advantages and disadvantages, and it is becoming increasingly important for organizations to be able to evaluate which option best fits their unique needs.
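For a taste of what browser automation looks like in any of these frameworks, here is a minimal Selenium WebDriver check in Python. This is an illustrative sketch, not a full framework; it assumes Selenium 4 and a local Chrome installation:

```python
# ui_smoke_check.py -- a minimal Selenium 4 example (illustrative only).
# Assumes: pip install selenium, plus a local Chrome installation
# (Selenium Manager fetches a matching driver in Selenium 4.6+).
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://example.com")
    heading = driver.find_element(By.TAG_NAME, "h1")
    # A trivial assertion standing in for a real acceptance criterion.
    assert "Example Domain" in heading.text
    print("UI check passed")
finally:
    driver.quit()  # always release the browser, even on failure
```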
Looking forward
Testing is an integral part of the business process and any digital transformation. Skillfully implemented, these testing trends contribute to increased efficiency, reduced costs, and better software. Increased test automation, AI/ML tools, greater technical proficiency, and shifting left all hold the promise of delivering greater value to the organization. Companies overwhelmed by testing should work with experts who understand best practices and which type of testing is right for which scenario. If you're looking for assistance or would like to learn more about our software testing capabilities, contact us today.
Reduce security breaches: How Cloud Identity as a Service can benefit you
Moving your business to the cloud is inevitable. And it is critical that you can keep your data, your employees' data, and your clients' data safe. Global insurance carrier Hiscox reports the average cost to recover from a data breach is $200,000, whereas a study by the Ponemon Institute (sponsored by IBM) estimates the average cost to be $3.92 million. With data and applications now largely in the cloud and an onslaught of mobile-workplace devices accessing your systems, the focus is no longer on the network. Identity and access are now center stage, and the weight of effectively managing these in the cloud is on your shoulders. But it doesn't have to be. Cloud Identity as a Service (IDaaS) is a cloud-based subscription in which you pay a third party to manage your identities and access in the cloud, over the internet. In this article, we'll explain how identity and access management through IDaaS can benefit your company, and you'll learn about three factors you need to consider before you choose a platform.
Why Cloud Identity as a Service?
With 90% of companies in the cloud, Cloud Identity as a Service platforms have made it easier than ever to provide commercial and enterprise customers with rich and highly secure web experiences across many applications. Whether your customers log in with standard credentials (username and password), social identities (such as Google or Facebook), or their corporate credentials, top vendors such as Microsoft, AWS, Okta, and Auth0 all provide the ability to natively authenticate with dozens of providers. Companies prefer to offload identity and access management because IDaaS costs less than you would pay to repair a breach and mitigate the damage. While estimates vary greatly (depending on many variables, including the type of hack, the degree of connectivity, and how the study defines "recovering" from a breach), the bottom line is that a single company's internal resources are no match for the expertise and layers of security measures implemented by cloud providers. So where do you start?
"95% of security breaches in the cloud will be caused by customers." (Gartner prediction)
4 benefits of implementing a cloud identity platform
Here are the top four ways your company will benefit from a cloud identity platform:
1. Improve total cost of ownership and reduce risk
Your company no longer needs to store sensitive passwords in a database, stay up to speed on the latest cryptographic algorithms, or implement the latest single sign-on protocols. This is all managed for you in the cloud identity platform.
2. One login across multiple services
As companies move away from the monolithic application toward microservices, it's becoming more painful to manage authentication across services. Token-based single sign-on allows users to move seamlessly across applications and services within your organization.
3. Corporate and social providers easily accommodated
IT departments are mandating that corporate vendors and partners honor their corporate credentials for accessing web-based systems. Cloud identity platforms make this simple by supporting standard protocols like OpenID Connect and SAML, onboarding new customers in a matter of hours.
4. Decreased risk through multi-factor and passwordless authentication
Passwords are insecure. At its Ignite conference in 2020, Microsoft revealed that it now has over 150 million users authenticating without passwords, and the world is a more secure place because of it.
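Most authenticator apps rely on time-based one-time passcodes (TOTP). Here is a minimal illustrative sketch of the mechanism, assuming the third-party pyotp library (pip install pyotp); an IDaaS platform handles all of this for you:

```python
# totp_demo.py -- illustrative sketch of time-based one-time passcodes (TOTP),
# the mechanism behind most authenticator apps. Uses the third-party pyotp
# library; real IDaaS platforms manage secrets and verification for you.
import pyotp

# The shared secret is provisioned once, usually via a QR code the user scans.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

code = totp.now()          # what the user's authenticator app displays
print("Current passcode:", code)

# The server verifies the submitted code against the same shared secret.
assert totp.verify(code)   # True within the ~30-second validity window
```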
Authenticator apps and mobile-based one-time passcodes like the sketch above make passwordless sign-in possible: users verify their identity by what they have, not what they know.
3 factors to consider before you choose a cloud identity platform
At this point, you might want to start researching your different options. Despite being simple for the end user and administrator, there are some very important design considerations to think about as you compare cloud identity platforms.
1. Price
Cost savings is a huge factor for most companies switching to IDaaS. On-site identity management often comes with the cost of servers, software licenses, maintenance and upgrade fees, and the cost of actually managing the security. IDaaS saves you from all of that: typically, you're simply paying a subscription fee. The subscription will look different depending on how you're planning to use the platform, and going into the conversation knowing what you want allows you to pay only for what you need (e.g., number of identities, frequency of authentication, etc.).
2. Configuration options
All vendors, either loosely or strictly, conform to the standard OAuth and OpenID Connect protocols for issuing tokens. Unfortunately, some vendors introduce their own terminology, and the specification itself is lengthy and complex. If misconfigured, it is far too easy to end up with a system that is unmaintainable at best and insecure at worst. Having a trusted partner who has implemented these systems is key to success.
3. Additional security features
The top vendors also include advanced security features, such as brute-force detection, anomaly detection, breached-password detection, and advanced logging and analytics. All of these features give you and your customers added protection against attackers trying to gain access to your systems. You'll have improved cybersecurity and save time through fewer password resets and faster logins.
What's next?
With all of your data moving to the cloud, and employees and clients conducting business on personal devices, how can you make it work for you? Cloud Identity as a Service can be the answer. So, if you're ready to move to a platform that delivers the security and peace of mind you need for your business, or just want to figure out where to start, talk to us.
The rise of big data technologies and why it matters
Getting the right data to the right people at the right time is the name of the game in today's demanding marketplace. Every company has to find a way to harness big data and use it to drive growth. And if your organization isn't talking big data, you are at a competitive disadvantage. This article covers a top-level view of big data's evolution and key components. It can help you understand the importance of big data and the technologies essential to its discussion. With this foundation, you can proceed to the next step: addressing what to do with your data and how.
Just how much data exists?
Every passing moment, the pace of data creation continues to compound. In the time it takes you to read these few paragraphs, there will be:
- more than 200 million emails sent
- millions of dollars in e-commerce transacted
- 50 hours of YouTube videos uploaded
- millions of Google searches launched
- tens of millions of photos shared
Every few minutes, this cycle repeats and grows. In 2019, 90% of the world's digital data had been created in the prior two years alone. By 2025, the global datasphere will grow to 175 zettabytes (up from 45 zettabytes in 2019), and nearly 30% of the world's data will need real-time processing. Over the last decade, an entire ecosystem of technologies has emerged to meet the business demand for processing an unprecedented amount of consumer data.
What is big data?
Big data happens when there is more input than can be processed using current data management systems. The arrival of smartphones and tablets was the tipping point that led to big data. With the internet as the catalyst, data creation exploded with the ability to have music, documents, books, movies, conversations, images, text messages, announcements, and alerts readily accessible. Digital channels (websites, applications, social media) exist to entertain, inform, and add convenience to our lives. But their role goes beyond the consumer audience, accumulating invaluable data to inform business strategies. Digital technology that logs, aggregates, and integrates with open data sources enables organizations to get the most out of their data and methodically improve bottom lines. Big data can be categorized into structured, unstructured, and semi-structured formats.
The development of modern data architecture
Until recently, businesses relied on basic technologies from select vendors. In the 1980s, Windows and the Mac OS debuted with integrated data management technology, and early versions of relational database engines became commercially viable. Then Linux came onto the scene in 1991, releasing a free operating system kernel. This paved the way for big data management.
What is big data technology?
Big data technologies refer to the software specifically designed to analyze, process, and extract information from complex data sets. There are different programs and systems that can do this.
Distributed file systems
In the early 2000s, Google proposed the Google File System, a technology for indexing and managing mounting data. A key tenet of the idea was using many low-cost machines to accomplish big tasks more efficiently and inexpensively than the hardware of a central server. Before the Information Age, data was transactional and structured. Today's data is assorted and needs a file system that can ingest and sort massive influxes of unstructured data.
Open-source and commercial software tools automate the actions necessary to make the new varieties of data, and their attendant metadata, readily available for analysis.
Hadoop
Inspired by the promise of distributing the processing load for increasing volumes of data, Doug Cutting and Mike Cafarella created Hadoop in 2005. The Apache Software Foundation took the value of data to the next level with the release of Hadoop in December 2011. Today, this open-source software technology is packaged with services and support from new vendors to manage companies' most valuable asset: data. The Hadoop architecture relies on distributing workloads across numerous low-cost commodity servers. Each of these "pizza boxes" (so called because they are an inch high and less than 20 inches wide and deep) has a CPU, memory, and disk storage. They are simple servers with the ability to process immense amounts of varied, unstructured data when running as nodes in a Hadoop cluster. A more powerful machine called the "name node" manages the distribution of incoming data across the nodes. By default, data is written to at least three nodes and might not exist in its entirety as a single file in any one node. [Diagram: the Hadoop architecture at work]
Open source software
The majority of enterprises today use open source software (OSS). From operating systems to utilities to data management software, OSS has become standard fare for corporate software development groups. A progressive OSS organization, the Apache Software Foundation is a non-profit group of thousands of volunteers who contribute their time and skills to building useful software tools. As its creators, Apache continuously works to enhance the Hadoop code, including its distributed file system, the Hadoop Distributed File System (HDFS), and the code distribution and execution features known as MapReduce. Within the past few years, Apache has released nearly 50 related software systems and components for the Hadoop ecosystem. Several of these systems have counterparts in the commercial software industry. Vendors have packaged Apache's Hadoop with user interfaces and extensions while offering enterprise-class support for a service fee. In this segment of the OSS industry, Cloudera, Hortonworks, and Pivotal are leading firms serving big data environments. Some software systems are now so tightly coupled to the core Hadoop environment that no commercial vendor has attempted to assimilate their functionality. The range of OSS systems, tools, products, and extensions to Hadoop includes capabilities to import, query, secure, schedule, manage, and analyze data from various sources.
Storage
Corporate NAS and SAN technologies, cloud storage, and on-demand programmatic requests returning JSON, XML, or other structures often serve as secure repositories of ancillary data. The same applies to public datasets: freely available datasets covering economic activity by industry classification, weather, demographics, location data, and thousands more topics. Data at this scale demands storage. Distributed file systems greatly reduce storage costs while providing redundancy and high availability. Each node has its own local storage, and these drives don't need to be fast or solid-state (SSDs); they are inexpensive, high-capacity commodity drives. Upon ingestion, each file is written to three drives by default.
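That default replication factor drives capacity planning. Here is a back-of-the-envelope sketch; the cluster numbers are illustrative assumptions, not sizing guidance:

```python
# hdfs_capacity.py -- back-of-the-envelope capacity math for an HDFS cluster.
# All numbers are illustrative assumptions.
NODES = 20               # commodity "pizza box" servers
DRIVES_PER_NODE = 4
TB_PER_DRIVE = 8
REPLICATION_FACTOR = 3   # HDFS default: each block lands on three nodes

raw_tb = NODES * DRIVES_PER_NODE * TB_PER_DRIVE
usable_tb = raw_tb / REPLICATION_FACTOR

print(f"Raw capacity:    {raw_tb} TB")        # 640 TB
print(f"Usable capacity: {usable_tb:.0f} TB")  # ~213 TB after 3x replication
```

The trade is deliberate: you give up roughly two-thirds of raw capacity in exchange for redundancy, high availability, and the ability to lose drives or whole nodes without losing data.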
Hadoop's management tools and the name node monitor each node's activity and health so that poorly performing nodes can be bypassed or taken out of the distributed file system index for maintenance. The term "data lake" describes the vast storage of different types of data. These vastly different data sources come in at least a dozen different file formats. Some are compressed or zipped. Some have associated machine data, as found in photos taken with any phone or digital camera: the date, camera settings, and often the location are available for analysis. For example, a query to the lake for text messages that included an image taken between 9 p.m. and 2 a.m. on Friday or Saturday nights in Orlando on an iPhone would probably show fireworks at Disney World in at least 25% of the images.
Administration of big data initiatives
Enterprise administration of applications (their storage requirements, security granularity, compliance, and dependencies) required Hadoop distributions to mature these capabilities in the course of becoming managed services to the enterprise, like the distributions from Cloudera and Hortonworks. [Figure: Hadoop's place among other software ecosystems] Popular analysis tools remain valuable in developing big data solutions:
- Excel and Tableau
- databases such as SQL Server and Oracle
- development platforms such as Java and Informatica Data Quality
Administration through Cisco and HP tools is common.
Further development of big data
Commercial software companies have begun connecting to Hadoop, offering functionality such as:
- data integration
- quality assessment
- context management
- visualization and analysis
These offerings come from companies such as IBM, Microsoft, Informatica, SAP, Tableau, Experian, and other established vendors.
Analytics and data science
Analytics is the endgame for developing a big data environment. The rise of big data has given credence to a new resource classification, the data scientist: a person who embodies an analyst, technologist, and statistician all in one. Using several approaches, a data scientist might perform exploratory queries using Spark or Impala, or might use a programming language such as R or Python. As a free language, R is rapidly growing in popularity. It is approachable by anyone who is comfortable with macro languages such as those found in Excel. R and its libraries implement statistical and graphical techniques.
Moving to the cloud
Cloud computing is very different from owning server-class hardware and software. It involves cloud storage, multi-tenant shared hosts, and managed virtual servers that are not housed on a company's premises. In cloud environments, an organization does not own equipment, nor does it employ the network and security technologists to manage the systems. Cloud computing provides a hosted experience, where services are fully remote and accessed with a browser. The investment to build a 10- or 20-node HDFS cluster in the cloud is relatively small compared to the cost of implementing a large-scale server cluster with conventional technologies. The initial build-out of redundant data centers by Amazon, Microsoft, Google, IBM, Rackspace, and others has passed; we now have systems available at prices below the cost of a single technician. Today, cloud computing fees change rapidly, with pricing measured by various usage patterns.
Conclusion: The rise of big data is evident
Big data is not a fad or a soon-to-fade trending hashtag.
Since you began reading this article, more than 100 million photos have been created, a sizeable portion of which have a first-degree relationship to your industry. And the pace of data creation continues to increase. The distribution of computing processes can help organizations gain a 360-degree view of their customers through big data collection and analysis, and companies that embrace big data technologies and solutions will rise ahead of their competitors. Big data technologies are becoming an industry standard in finance, commerce, insurance, healthcare, and distribution, and embracing them is key to optimization and continued growth. Companies that embrace data solutions can continue to improve management and operational processes and create a competitive advantage to withstand an ever-evolving marketplace.
Digital transformation: Our predictions for 2021 and beyond
What will the future hold when it comes to digital transformation? We don't have a Magic 8 Ball or a special spidey sense, but our team does anticipate sizeable change. We asked a few of our team members what they thought based on their work and personal experiences. Here's what they're envisioning for 2021 and beyond.
The cloud gains more ground
I expect cloud-based business application platforms such as Dynamics 365, Salesforce, ServiceNow, and Workday to drive significant digital transformation within modern workplaces in the next year. Following on the heels of many core infrastructure services moving to the cloud (such as email, servers, files, and data), the next major lift for many organizations will be to modernize and automate their core business processes. I anticipate areas like finance, HR, production, and other critical business operations and workflows will make the next major shift to cloud-based business application platforms. Moving away from legacy, on-premise solutions is not always a simple task, but in doing so, employees can work remotely without being tethered to an office environment.
Greg Deckler, Vice President, Cloud Services. Connect with Greg on LinkedIn.
Remote workers collaborate differently
The work-from-anywhere model has been proven to work, and it will continue. However, right now, a Zoom meeting is about the extent of what most people see as remote teamwork, and we all know those can be exhausting. I predict greater adoption of tools like Miro and Mural. These online workspaces allow for active collaboration and co-creation in real time. The need to move quickly and keep pace with digital transformation will require these types of tools, and people who know how to leverage them, to make the most of a remote team's time together.
Doug Scamahorn, Solution Director, UX Design & Innovation. Connect with Doug on LinkedIn.
Cookie compensation
I think 2021 will be the year when businesses and marketers confront the pending deprecation of the third-party cookie. Google is driving the industry toward new solutions for retargeting and attribution following the announcement that Chrome will cease to support third-party cookies in 2022. While industry players debate a long-term replacement, expect to see a scramble to shore up first-party data in the meantime. At a tactical level, this will look like increased pushes for "registered" online experiences where users must explicitly identify themselves, as well as the integrations that power these points of data collection. In the background, businesses will be pushing to connect the dots between online and offline touchpoints using a variety of identifiers, from email addresses to devices to data from "walled gardens" like Amazon, Facebook, and even Walmart and Target. Companies may opt for a CDP (customer data platform) solution on top of their existing data stack to manage data points specifically for targeted marketing campaigns. When reporting on campaign success and attribution, analysts may need to adopt new tools and strategies for managing "fuzzier" readouts on customer behavior and journey identification.
Amy Brown, Solutions Director. Connect with Amy on LinkedIn.
Augmented reality becomes actual reality
As mobile processing and bandwidth progress and mature, we can expect more augmented reality (AR) apps to provide visual assistance in a huge range of applications.
I fully expect vehicles with heads-up displays, smart glasses (remember Google Glass?), and other clear displays to be adopted by more companies and, thus, individuals. Visual processing in itself is gaining in popularity. Retailers like IKEA are already using AR, with the IKEA Place app letting customers "see" furniture in their spaces. Microsoft's recent HoloLens release is a good example of where we're headed.
Jeremy Keiper, Competency Lead. Connect with Jeremy on LinkedIn.
B2B marketers will get more creative
There's always been an understanding that marketing is both an art and a science. Over the last decade, marketers have leaned into the science. Data provided marketers with information about customer behavior that was never available before. Even before the pandemic, B2B marketers were relying heavily on digital channels to engage customers. But pandemic office closures caused marketers to rely on channels like email, webinars, social media, and search engine marketing (SEM) in an attempt to reach prospective buyers who were now working from home. And they had to get creative. Marketers had to be willing to test new ideas, try things that haven't been "proven," and think creatively about how they connect with and engage prospects and customers. I anticipate this will continue: marketers will use customer data to make sure they understand consumer goals and motivations, then get creative about how to reach out and connect.
Kristin Raikes, Sr. Director of Digital Strategy. Connect with Kristin on LinkedIn.
Looking ahead
Thinking about the year ahead, we do know that even after offices reopen and things get back to "normal," the new "normal" will look different than it did before. If people continue to work from home or prefer to engage with brands virtually versus physically, then technology will have to adapt. Are there any major trends not listed above that you think will be key to digital transformation this year? If you have questions about specific trends, you can also connect with our team via their LinkedIn profiles above. Our consultants and team members work with clients to improve, streamline, and create actionable change. We create exceptional customer experiences by leveraging data insights, experience design, and technology to transform the way you connect with your customers. Interested in learning more? Let us know, or sign up for our newsletter to get to know us.
Driving digital transformation forward
Everyone is talking about digital transformation, but there's a lot of confusion and misinformation about what the term means. Much more than a buzzword or doing things "digitally," digital transformation means reimagining your business and driving it forward in a better way. When done correctly, digital transformation can fundamentally change how you deliver value to your customers. That's what you need to focus on if you want to thrive in this market. Customers today have high expectations and a lot of options, so carefully considering your customer journey is critical to success. Before you get started, here's an overview of what you need to know about digital transformation and how to continue to drive it forward:
- What exactly is digital transformation?
- The evolution of digital transformation
- Cloud computing fuels digital transformation
- More than just technology
- Keys to successful digital transformation
- The challenges of digital transformation
- Where are you in your journey?
What exactly is digital transformation?
Digital transformation is the integration of technology into all areas of a business, fundamentally changing how you operate and deliver value to customers. It is often confused with digitization and digitalization, but the three are distinct. Digitization is the process of converting information from a physical format to a digital one, like typing up paper notes in Microsoft Word or scanning them into a PDF. Digitalization, on the other hand, uses those digital files to make the processes already in place more efficient; it makes your processes faster but doesn't evolve the processes themselves. Digital transformation goes further, enabling you to interact with your customers in a new way that constantly evolves to meet both their needs and your business needs. By ideating and implementing better business processes and technologies, you'll not only create an elevated customer experience that results in increased profitability, but you'll also save significant time and money in operating costs.
The evolution of digital transformation
Digital transformation might be a popular term today, but the same was true from the late 1990s through the mid-2000s. It started with companies computerizing processes 30 years ago, and as the internet was established, websites started to connect companies with their customers. That's when Fusion Alliance got started, and we've been focused on the end goal of developing solutions for clients ever since. Digital processes emerged to support customer interactions, from sending emails to managing online ordering. As digital ambitions grew, companies realized a need for dedicated digital teams to manage social and mobile channels. Connected to customers, suppliers, and other stakeholders, companies then realized the need to connect all of these data silos. Seeing the potential in connectivity, organizations focused on digital platforms connecting all systems. Then they started to experiment with new digital ways of doing business, leveraging data more effectively, and creating greater agility. Today, we live in a world where customer expectations have never been greater. Customers demand personalized experiences in every interaction with a company's products and services. Because of this, companies must innovate quickly and deliver.
In addition, the pandemic has forced IT leaders to adapt yet again, with many adopting cloud software for video collaboration and building apps that enable workers to enter offices governed by social distancing practices and contact tracing. Technologies such as cloud computing, the Internet of Things, and artificial intelligence all power the innovation that delivers value. Technology is an important element of digital transformation. But often, it's more about doing away with outdated processes and legacy technology than it is about adopting new tech. It should also be about enabling innovation.
Cloud computing fuels digital transformation
Companies are increasingly moving toward a hybrid cloud infrastructure. From SaaS applications and on-premise solutions to a mix of public and private clouds, hybrid cloud strategies help companies find the right balance for their unique infrastructure needs. Over the past year, cloud providers like AWS, Azure, Google, IBM, and Oracle have made investments to support hybrid strategies. OEMs like HPE, Dell (VMware), and Cisco have also increased investments in tools that enable simpler connectivity between on-premises data centers and the cloud. These investments are all centered on meeting customers where they are in the moment. Hybrid cloud adoption was already underway before 2020's pandemic, but the sudden disruption sped things up. Being agile and nimble was, and still is, a significant business advantage.
More than just technology
Although the focus of digital transformation is often on emerging technologies moving the business forward, true transformation has to encompass much more. Business leaders at the top level must be involved in the change and must be willing to invest in and empower employees, as well as focus on building culture. As employees begin to see changes occurring around them, they might begin to wonder how it will affect them and their coworkers. They may question their own position within the organization and try to figure out their own best next steps. During any transition, leadership must communicate with employees to help them feel secure in their own positions as well as in the direction of the organization. Head off potential issues by making sure your employees understand what digital transformation is, what it's not, and how they can be a part of it. Additionally, it may be best to communicate how rapidly things can change throughout this process. Being as transparent as possible and preparing your employees for future changes can make all the difference in ensuring your digital transformation is successful.
Keys to successful digital transformation
Data from McKinsey shows that when companies achieve transformation success, they are more likely to have digitally savvy leaders in place. Less than one-third of all respondents say their organizations have engaged a chief digital officer (CDO) to support their transformations, but those that do are 1.6 times more likely than others to report a successful digital transformation. Companies that have already invested in data and infrastructure to support technology efforts are also more likely to succeed again. The keys to finding success with digital transformation projects vary because no two companies are the same. However, we do have a few recommendations:
Define what digital transformation means to your business
Create a map of where your business is now, including people, process, data, and technology. Then define where your business needs to be.
The gap identified becomes a roadmap. Defining where the gaps are in the business is the first key step in your digital transformation process.
Identify and involve the right internal stakeholders
Leaders and decision-makers might not have the insider knowledge to execute on technical challenges. Your stakeholders should come from throughout the organization, across departments, and at different leadership levels.
Align with a partner to shepherd the process
You might find your business well-suited to tackle projects outside your normal scope of work, but many companies lack the internal resources to undertake larger projects. Finding an outside partner can be helpful to keep things moving and provide direction on next steps.
The challenges of digital transformation
While digital transformation is worthwhile, and necessary to the survival of your organization, there are challenges that come along with reimagining any business process. To make your transformation truly successful, ensure that you have a good understanding of your brand and your customers. Without that, the entire digital transformation strategy you build will be misguided, and you'll end up back at square one. Budget can also be a big hindrance for a lot of companies looking to begin their digital transformation. Often, additional resources and training are needed upfront. Although the cost savings and increased profitability are worth it in the end, the initial expense can be intimidating. Additionally, poor data quality can become a huge challenge for organizations as they go down the digital transformation road. Poor data analytics often leads companies to make important decisions based on misguided data. Poor data also prevents companies from being able to use emerging technologies like artificial intelligence and machine learning, since these prove useless when fed bad data. Sound strategy and good data will ensure you start your digital transformation journey on the right foot.
Where are you in your journey?
Whether you're an established business or a startup, the perfect time for digital transformation is now. No matter how old your business is, your digital transformation will be unique to your organization. It is helpful to think ahead to the future: where do you see your company in the next five years? Create a roadmap of where you want to be in terms of customer experience, technology, and data insights, and involve those team members in your planning discussions. By involving decision-makers across the organization, you can ensure that you are aligning all parts of your business with the end goal. And, if you can, bring in an experienced third party to help streamline planning, bring in additional insights, and ensure that your projects run smoothly. Digital transformation allows your organization to deliver the right customer experience to the right people and not only remain relevant in your market, but actually build your brand.
Your digital transformation partner
One thing is consistent: customers today expect a flawless, customized customer experience, and you'll need digital transformation to deliver it. When you're ready, we're here to help you execute. Need help addressing or assessing where you are on your organization's digital transformation journey? Let us know.
When testers become plumbers: Testing in the DevOps pipeline
Are you currently using a DevOps pipeline, or moving in that direction? Are you looking to integrate testing into the release pipeline? In this article, we'll briefly discuss DevOps pipelines and give some hints on how you can test in such an environment.
Joining the DevOps movement
As you participate in the never-ending dance of software development and releases, you've probably come across the term "DevOps." Though it sounds like an elite team of special operatives, the portmanteau is more accurately described as a movement to blur the lines between the traditionally siloed worlds of development and operations. In an on-premise software-deployment environment, isolating development and operations has made sense. Operations can focus on releasing the software to the appropriate customer through a medium that users can consume. The developers are 100% dedicated to adding new features and making sure they don't break the build, and they don't need to worry about the details and demands of delivering to individual customers. Yet software is increasingly moving "to the cloud." With cloud apps and services being deployed to locations hosted by software companies (rather than users), the load on operations is eased. Simultaneously, this movement is eroding the argument that development and operations should continue to be siloed. DevOps pipelines are a powerful tool for uniting the two functions. With the adoption of continuous integration/continuous deployment (CI/CD) tools such as Jenkins, Travis CI, Gradle, and Bamboo, you can put together a sturdy, reliable pipeline to handle everything from initial build to release. Developers can now push to a code repository and then watch an entirely automated process release that code into your desired environment. [Figure: Common pipeline components and corresponding tools]
Is quality going down the drain with a pipeline?
If the software-release process is handled by an automated pipeline, is there still an opportunity to test the code before it's released into the wild? Do I still need testers? Absolutely! Adopting a pipeline does not mean software testing should be sacrificed to streamline the release process. The very basis of DevOps pipelines is to make the release process reliable and easily repeatable while reducing the chances of errors, delays, and miscommunication. All of that contributes to higher-quality products, right? New or existing automated tests can be easily and smoothly integrated into the pipeline. Unit tests, smoke tests, and regression suites can all be run as part of a release pipeline. The testing is not limited to API tests, either. CI tools such as Jenkins or third-party testing platforms like Mabl can run your UI tests in "headless mode" to avoid interrupting team members' work whenever the pipeline runs. "Continuous testing" is a significant opportunity with CI/CD. Let's walk through a sample scenario. Say several developers have reviewed and approved a code merge to the develop branch of a code repository. Once the code is merged, the CI/CD tool picks up the changes in the code repository and tells the build server to build the code. If the build is successful, unit tests (provided by the developers) are run by the build server as a sanity check. Then, a suite of automated software tests can be kicked off against the successful build as a further check that the code changes have not broken existing features. Only then is the code ready to be deployed by another tool.
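For a concrete flavor of one of those automated steps, here is a minimal, hypothetical pytest check of the kind a pipeline stage might run against a freshly deployed test build. It assumes pytest and requests are installed, and the URL and endpoint are placeholders:

```python
# smoke_test.py -- a hypothetical post-build smoke check, pytest style.
# Assumes: pip install pytest requests. BASE_URL is a placeholder for
# wherever your pipeline deploys the build under test.
import requests

BASE_URL = "https://dev.example.com"

def test_health_endpoint_returns_ok():
    # A failed assertion fails this pipeline stage and blocks deployment.
    response = requests.get(f"{BASE_URL}/health", timeout=10)
    assert response.status_code == 200

def test_homepage_serves_html():
    response = requests.get(BASE_URL, timeout=10)
    assert "text/html" in response.headers.get("Content-Type", "")
```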
If any of these steps fail, the pipeline will notify the responsible parties that the build is broken, and deployment is terminated. A successful run will result in deployment to the dev environment.
The pipeline is flexible
The scenario above describes only one potential setup. The multitude and variety of available tools allow for incredible flexibility! If manual tests are required, the pipeline can be configured to build the code and then wait until the appropriate user approves the deployment. Does this defeat the point of using a pipeline? Not at all! The pipeline keeps all of the steps in order, and any part that should be automated in your specific system can be automated. Steps won't be forgotten or completed out of order. Logs and results are kept in the pipeline for future reference and easy visibility into the status of releases.
So where do I start?
You could certainly begin by compiling a pipeline from multiple tools. Many of the individual tools have excellent documentation with notes on how to integrate with other components. If you are new to creating a pipeline, however, we'd recommend checking out GitLab. It is a single tool that can handle source code management, continuous integration/continuous delivery, builds, and releases. With GitLab, your pipeline can be built and managed in a single place. Your software can be tested on the pipeline using your automated test suites stored in GitLab, or by running a command to kick off tests living somewhere else. The source code management is based on Git and is very similar to GitHub, with one prominent difference being the use of "merge requests" rather than "pull requests": same basic concept, just different terminology. [Figure: GitLab's pipeline diagram, displayed while running]
Another popular CI/CD tool is Jenkins, an established and well-supported automation server. Like GitLab, Jenkins provides a UI for pipeline dashboards and interaction. Jenkins, however, is not an all-in-one CI/CD solution. While GitLab hosts the resources it uses in the pipeline, Jenkins is more of a coordinator that calls the different pieces of your pipeline from wherever they are hosted. It is an exceedingly useful and time-tested tool for scenarios where you want to utilize several different platforms in one pipeline. Jenkins can be configured to build, test, and deploy code once it is pushed to a repository. [Figure: Jenkins dashboards can show multiple projects in one view]
If the pipeline fits…
As support for DevOps continues to grow, so does the number of available tools for adding automated tests to your pipeline. A well-built pipeline is incredibly flexible and should never require teams to reduce software testing, even if some of that testing is manual. Tools such as GitLab and Jenkins provide an excellent starting place for building a sturdy software-release pipeline as you begin your DevOps journey. So, go ahead: put on those coveralls and grab your wrench. It's time to build your software release pipeline and join the DevOps movement. And if you're struggling with where to start or need assistance, don't hesitate to get in touch. We're always here to help!
6 ways accessibility will impact businesses and website design in the future
Most people take websites for granted. They pay bills, book flights, and download white papers online with relative ease. But not everyone can assume that digital tools are designed with them in mind, and that's a failure for everyone. One in four U.S. adults report having a disability that impacts major life activities. That's 61 million friends, neighbors, and family members who deserve a digital experience that is just as user-friendly as anyone else's. Integrating website accessibility into your design process and culture is a step toward addressing this very human problem of exclusivity. And while a sense of justice is enough to move many organizations to act, there is also a strong business case. Offering a user experience that caters to only a select group of users alienates potential brand advocates and carries serious legal risks. According to the Americans with Disabilities Act (ADA) Title III regulations, public accommodations, including public websites, have to adhere to accessibility parameters. In other words, your digital presence must be designed and coded so that people can carry out their desired tasks, from completing a form to making a purchase. This should extend beyond your basic site to search tools, mobile apps, and social media. Organizations often find themselves on the wrong side of this issue. Rather than proactively carving out a path that invites accessibility in as a priority, they are reacting to negative feedback and even lawsuits. These companies are well-meaning but don't know what to do to ensure compliance. To help keep you up to speed, here are six ways accessibility will impact businesses and website design in the near future:
1. Lawsuits will escalate
From 2017 to 2018, the number of website accessibility lawsuits filed in federal court under Title III of the ADA shot up from 814 to 2,258. This trend will likely continue as more users hold noncompliant websites and other digital tools accountable. Recent high-profile lawsuits have called out Winn-Dixie, Beyoncé, Burger King, Rolex Watch, and Amazon. In a particularly unsavory 2019 story, instead of fixing its online ordering feature, Domino's Pizza responded to a blind customer's lawsuit with a petition to the U.S. Supreme Court to quash the case.
2. Standard design processes will change
To achieve a more inclusive user experience, designers and developers follow the Web Content Accessibility Guidelines (WCAG) for web standards. It's part of their process, showing up as captions on videos for people with impaired hearing or spoken versions of site copy read aloud by screen readers. Before accessibility can become an intrinsic part of the website design process, organizations will have to rally their troops and emphasize its importance. In the near future, writing alt text for images, ensuring all content can be accessed with a keyboard, and making sure text can be viewed at 200% zoom without impairing readability will be second nature. While we aren't there yet, the additional steps needed to build a compliant site will become standard procedure over the next few years.
3. Know-how will develop fast
Remember the 1990s, when accessibility ramps were tacked onto commercial buildings like ugly metal afterthoughts, function-rich but design-poor? Today, ramps are architectural features, such as switchbacks crisscrossing wide flights of stairs and curving slopes that add to a structure's beauty. In digital, we aren't working with hammers and drills, and we've had 29 years to appreciate the precepts of the ADA.
The speed at which website accessibility can and should evolve will be much faster than that of its brick-and-mortar counterparts. Plus, adherence helps companies compete more effectively for the more than $645 billion in disposable annual income that Americans with disabilities control, creating an additional layer of urgency.
4. Site facelifts will facilitate compliance
Organizations regularly upgrade their websites and apps to make them faster, more secure, or better optimized for search. Add accessibility to that list. When we talk with our clients about site improvements, accessibility is at the forefront of conversations about website facelifts. These are often great opportunities to ramp up (pun intended) inclusion efforts.
5. Someone will own accessibility
Who on your team will lead an initiative around accessibility? How will this person develop knowledge and implement more stringent ADA accessibility user testing? Is this a role for design/development or someone in HR/legal? More organizations are asking themselves these questions, and many are looking to outside partners to help them get and stay compliant. Whether it's handled internally or externally, accessibility will become part of someone's job description.
6. Audits will head off future legal fights
ADA lawsuits and the news stories that follow can burn through an organization's brand equity, repel customers, and rack up hefty legal expenses. When performed by a trusted digital partner, an audit can bring web accessibility infractions to light so that you can deal with them before they impact your audience. (One caveat: beware of predatory auditors. Scammers have been known to offer auditing services and then threaten to report noncompliant clients for ADA violations if they don't sign on for follow-up projects.) As organizations take their first steps to prioritize accessibility, initial results might look and feel like that ugly ramp from the 1990s. At Fusion Alliance, the growing pains have been worth it to ensure that our user experience offers everyone the same level of respect and compassion. Making accessibility part of your digital design conversations now will better serve every human in the future. Don't wait to talk to your team. And if you need help, don't hesitate to reach out.
Don’t leave remote collaboration to chance: Design it to be effective
How can we cultivate a culture of connectivity while remote? Have you ever felt the energy of working on a team where everyone was in sync and got things done? Or have you experienced the opposite, team collaboration that you dread, where no one agrees on anything and you just go in circles day after day? Collaborating as a team shouldn’t be left to luck. Instead of showing up and hoping for the best, why not design how you collaborate? Imagine getting everyone on the same page, setting guidelines, earning each other’s trust, and hitting or surpassing your goals. Those are the ideas that went through my mind as I developed an in-person Designing Collaboration workshop for the 2020 Indy Design Week (IDW) (a gathering of designers across Indianapolis). But when the conference pivoted into a virtual event due to the pandemic, I had to pivot too. Luckily, my Fusion Alliance team and I are no strangers to running virtual workshops, so it wasn’t difficult to transform my in-person Designing Collaboration workshop into the virtual Designing Remote Collaboration workshop. As I shifted focus to the remote format, I realized designing virtual collaboration isn’t just relevant, it’s necessary for success, especially today. This article will give you a quick walk through my workshop so you can see how bringing design thinking mindsets and methods with purpose into your remote team can improve collaboration. Unpack the common challenges of working as a team Our workshop began by getting all participants to think about the challenges involved in working as a team. Take a moment to think about what issues your team faces. See if any overlap with the ones we identified in the workshop: Members have different values or points of view Endless discussions without action Struggling to understand one another and align on next steps Domination by the loudest voices Review the mindsets of design thinking We shared the following mindsets of design thinking in our workshop. Just think of the impact on your organization if these concepts were to become practice. Being judgment free to encourage psychological safety Valuing getting started over being right to avoid analysis paralysis Allowing work to be time boxed and unfinished Working together, but alone, so everyone’s voice is heard Making things tangible instead of just discussing them so that there is a common reference Having a beginner’s mind to overcome our internal limiters Apply design thinking methods to remote teamwork For the collaborative portion of our workshop, teams focused on redesigning how remote teamwork works in their organizations. We accomplished this challenge using a purposefully designed recipe of design thinking methods. Icebreakers help bring order IDW volunteers Moti Saleminik and Blake Coats assigned participants into Zoom breakout rooms, and using Miro, an online collaboration tool, they began an icebreaker activity. While icebreakers can be fun, time is valuable, so our activity was designed for attendees to both get to know one another and to assign an order in which participants would speak to avoid stumbling over each other when it was time to share. The 2×2 matrix revealed that those in the session were overwhelmingly dog lovers! Visually capture where to focus A Speedboat exercise, a method for visually capturing anchors that are preventing progress, was then used for groups to note what about remote collaboration holds their organizations back. What about remote collaboration holds your team back? 
After a vote, our workshop groups had identified the most important challenge they felt should be addressed first. The top challenges across the groups fell into the following categories. Adopting new technology Virtual (Zoom) fatigue Disjointed virtual relationships Handling distractions New norms Brainstorm potential solutions Groups were then asked to turn their top challenge into a “how might we” statement starter in order to kick off a brainstorming activity. Brainstorming is most effective when using a structured ideation format such as the Creative Matrix, a grid for generating ideas at the intersection of topics. Using the Creative Matrix, participants generated nearly 60 ideas in just 15 minutes. A Heatmap vote exposed which ideas were most inspiring. These included: Rating the focus levels of participants to gauge whether a meeting should take place or not Aligning meetings with participants’ preferred times Intentionally building breaks that release tension into meetings Customizing views and features of meeting tools to express unspoken social cues and prevent distraction Take virtual collaboration offline With the next activity, Crazy-8’s, I took a risk by asking participants to work offline and then post their work to the Miro board. Changing the medium seemed like it could cause issues, but by using pen and paper, each participant quickly sketched variations of their ideas without having to learn a new tool. The sharing and discussion that ensued allowed groups to build on one another’s ideas, which included concepts such as: Rating the focus levels of a team to optimize collaboration times Tracking the time meeting attendees spent actively participating in a meeting Smart calendaring to ensure an individual’s most productive times are uninterrupted by meetings Enhancements to meeting tools, such as blurring video when inactive and emojis to virtually read the room Commit to action The last step prompted groups to reflect on their experience and note what types of actions they would commit to going forward. This is what they came up with: Optimistic about Awareness of how emotional and mental health impacts teamwork Breaking down the barriers of location Learning new tools and approaches to collaboration Worried about How easily people can disengage or be distracted when remote The speed with which teams can adopt new tools and techniques Team members’ ability to adapt to this new way of working Learning about Emotional intelligence and social behavior Focus and our environments Additional best practices for collaboration Committed to Greater planning and communication of meeting goals Being aware of and establishing boundaries while working remotely Implementing collaboration techniques Summary: The benefits of designing collaboration We wrapped up our time together by reviewing how intentionally designing collaboration can make teamwork more effective by: Building empathy and trust with teammates Continually iterating towards a goal Aligning around clear, tangible artifacts Engaging every team member Now that you’ve walked the journey with us, take these lessons and apply them to your own collaboration. Final thoughts I want to say thank you to the IDW event organizers who put on an amazing virtual program. They did a fantastic job of adapting the format and worked with all of the presenters and facilitators to ensure we had what we needed to provide participants with an experience that continued to deliver on the theme of Cultivating Connections. 
I also want to thank the workshop participants who spent their time with me. They dove right in, demonstrating the value of designing remote collaboration. It was a blast to facilitate this workshop, and remember, as Helen Keller said, “Alone we can do so little; together we can do so much.” Interested in working with me? At Fusion Alliance, we’re ready to support your digital transformation, innovation, and experience design initiatives with custom-designed workshops or consulting engagements. Learn more about workshops by Fusion Alliance. Want to see our virtual workspace? Looking for additional resources? Visit Design+Strategy Academy and let me know what you think.
Automated UI tests: Taking the plunge
Curious about automated user-interface-level (UI) testing? That’s good: curiosity is where it all begins, and you’ve come to the right place. The next step can be the most daunting. The purpose of this post is to provide some high-level strategies and encouragement to get you started on your journey. Let’s get a couple of things on the table to avoid potential confusion. First, our central focus will be on automated UI-level tests. Some of the concepts and ideas will naturally bleed over into code-level unit tests and service-level integration tests, and we’ll discuss these aspects of testing with UI features in mind. Second, these high-level ideas come from our own experience and do not always translate to your unique business processes, operations, and technology needs. As Kaner, Bach, and Pettichord reiterate in Lessons Learned in Software Testing, “. . . the concept of best practices doesn’t really exist devoid of context.” Now that you know what you’re in for, we hope that you’ll find conversation starters, thought provokers, or otherwise-useful nuggets to kickstart your transition into automated UI testing. Taking the plunge: Start with expectations and create a baseline What do you hope to get out of an automated UI effort? Who is going to be writing the tests? How frequently do you envision them running? Who is going to consume the output and reports generated by them? The goal here isn’t to have an exhaustive plan or answer all of these important questions right away. When you’re in the process of undertaking new business practices or adopting transformative technology, it is important to have some sort of starting point or baseline to compare subsequent changes. Committing thoughts and ideas along with notations of your business’ current testing environment to paper (physical or digital) can serve as that baseline. Which technology/tool am I supposed to use? It’s best to approach this question with an openness to all the different ‘tech flavors’ and be unafraid to make significant changes. This will directly impact initial expectations, particularly with regard to the skills necessary to author the tests and any supporting code. It’s also important to think about your own SDLC as a whole in this stage: are you considering transitioning to BDD? Is there already a solid deployment process you need to mesh with? How frequently are changes being pushed? Categorizing your options for testing technology will help answer some of these initial questions. Categorizing technology options SerenityBDD and Cucumber unlock the Gherkin syntax for describing behaviors, but require coded hooks in order to become executable. Selenium WebDriver and Appium open the door to controlling browsers, mobile devices, and desktop apps with the most modern programming languages, but require a unit testing framework in your language of choice to write the tests. Record and playback tools, such as Katalon and TestComplete, boast “codeless solutions,” although you may end up in a situation where you are constantly re-recording scenarios depending on the app under test and the release cycle. This is by no means an extensive list of everything out there. As you stumble across others in your research, categorize additional options with those mentioned here. I’ve picked a technology and someone to work with. Now what? One of the most common mistakes we see is trying to generate an enormous number of tests all at once to convert a manual regression suite completely over to an automated one. 
Beyond the strong reminder that automated tests are not a complete substitute for manual tests, this can lead to a casserole of difficult-to-maintain artifacts that are constantly breaking the build. Take it slowly. Work through those questions in the sections above within the scope of just a few tests. You will thank yourself in the long run if you’ve dealt with some of the pain points with a limited scope before trying to ramp up the volume. For example, if working with a web app, start with simple navigation tests, i.e., confirm that you can navigate to three different pages, including the homepage, by checking for page titles. Keep these tests as current as possible while changes to the application are in progress. Focus on how and when you run these tests. Consider how you might add more tests. If the thought of more tests seems too painful, consider the alternative of breaking down the process of conversion to even smaller steps. Useful example of taking the plunge into automated UI tests Let’s say that you’re a test manager at a company that builds technology solutions for healthcare providers. You’ve decided that you want to start experimenting with automated UI testing for one of the six different web apps currently under your purview. After considering the makeup of the whole team responsible for that app (BAs, scrum masters, application developers, testers, etc.), you’ve settled on the enterprising individual who will give this a shot. You discuss the current development and deployment processes, and, given the background of the project in question, decide that a Java project built on the command line best suits your technology and business process needs. After working through Selenium WebDriver tutorials, your test writer comes back to you with a project containing two tests: one that confirms that upon navigating to the homepage URL, the page title is accurate, and another that confirms the page title of the login page. Over the next few weeks, you focus on running these two tests frequently, ironing out your own build process, and working with the application development processes to determine when and how your tests execute. You also work through a couple of different reporting methods while figuring out how to present, discuss, and store that data. When ready, you expand your two tests to ten and (once again) iterate on your processes and goals. You continue this cycle until you’ve got solid coverage with reliable tests and processes. Armed with your experience from the first app, you turn your attention to the next one. (A minimal sketch of that first two-test project appears below.) Follow-up post: Automated UI Tests: Taming the Tangle
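To make that scenario tangible, here is a rough sketch of what those first two page-title tests might look like, assuming Selenium WebDriver with JUnit 5, Chrome, and placeholder URLs and titles standing in for the real app.

```java
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

import static org.junit.jupiter.api.Assertions.assertEquals;

class NavigationTests {
    private WebDriver driver;

    @BeforeEach
    void openBrowser() {
        driver = new ChromeDriver(); // assumes a chromedriver binary on the PATH
    }

    @Test
    void homepageTitleIsAccurate() {
        // The URL and expected title are placeholders for your own app
        driver.get("https://example.com/");
        assertEquals("Acme Patient Portal", driver.getTitle());
    }

    @Test
    void loginPageTitleIsAccurate() {
        driver.get("https://example.com/login");
        assertEquals("Acme Patient Portal - Log In", driver.getTitle());
    }

    @AfterEach
    void closeBrowser() {
        driver.quit();
    }
}
```

Starting this small keeps the build green while you iron out how and when the tests run; expanding to ten tests is mostly a matter of repeating the pattern against more pages.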
8 steps to speed up product design and development
Product design (UX/UI design) is becoming one of the most important roles in the tech industry. Designers are under pressure to accelerate product development and reduce the time, effort, and cost spent. We’ve been there and understand what it’s like. This eight-step process can help you speed up development and achieve all of the above. Use it to understand your product goals and customers, and also to collaborate with the entire team to discover problems, ideate, test, and validate potential solutions. 1: Understand the product and set up a strategy Familiarize yourself with product vision and strategy Your first step is to understand the product’s “big picture” and the vision behind it. You’ll need to answer the following questions: What problem is the product solving? What value is it delivering? Who are the users? Who are the competitors? Who are the partners? Conduct stakeholder interviews In order to answer the questions above, you’ll need to meet with project stakeholders. In your interviews with them, ask for the following: Mission statements, strategy documents, organizational or team structure charts, etc. KPIs (key performance indicators) − to help you understand the most important features in the product by understanding what success looks like. Previous research they’ve conducted − including user research, market research, competitor analysis, etc. Create a roadmap Roadmaps facilitate team collaboration and clarity around priorities. Create a roadmap to help your team better understand: What is the ideal state of the product? What is the current state of the product? What steps need to be taken to meet the end goal, and how should you prioritize them? 2: Conduct user research User research is one of the most important steps in the product-design process. All of your team’s hard work, time, and money will be worthless if you end up making a product that no one wants to use or that can’t compete in the market. Here are some research methods to help you better understand your users and competitors: Establish user personas: A persona is a hypothetical character created to represent a major user group that might use your product in a similar way. Create user personas to better understand your product’s users and their needs, goals, and pain points. To create user personas, use the data you gathered in stakeholder interviews, conduct surveys, interviews, ethnographic research, etc. Create a user journey map: A user journey is the path a user takes through your product to achieve a certain goal. User journey maps show users’ thoughts and feelings while using the product or going through that journey. This makes it easier for you to identify areas for improvement since you see when your users are annoyed, confused, or happy. Develop your user journey maps using the feedback received through user testing, observations, data received from the support team, etc. Conduct a competitive analysis: Conduct market research or a competitive analysis to learn what other similar products or companies are doing and analyze how their problem/solution could map to your own problems. 3: Define your information architecture Information architecture helps you organize and structure the content of your product so that your users can easily find what they are looking for without having to go in circles. 
Create this structure for your product through any of these methods: Site maps Flow charts Card sorts 4: Discover problems Discovery is an important phase that allows designers to work with the entire team to define and research problems identified in steps 1-3, as well as gather enough information and initial direction on what to do next. Discovery will help you frame problems with all the evidence you need before moving to the ideation phase. 5: Ideate The ideation phase moves you from learning about your users and the problem to coming up with potential solutions. In this phase, gather together and come up with as many ideas as possible. The focus is on quantity, not quality. Some ideas may surface as the potential solutions to your problem. Others will end up in the reject pile. If carried out properly, an ideation session can lead you to find that groundbreaking solution that you and your users are looking for. 6: Perform user testing User testing gives you the opportunity to evaluate and validate your ideas with the users. At this stage, you’ll be able to gain deep information about your users’ behavioral patterns, preferences, and suggestions. Testing early during the design process allows you to prevent future re-design costs and to launch a user-friendly product. 7: Finalize the design With the usability testing complete, you can start updating the design according to the feedback you received. You will now design what the screens will actually look like and create the final UI through high-fidelity wireframing and prototyping. 8: Communicate and collaborate Your last step is to share the design with developers and walk them through the entire user flow to give them the opportunity to review what needs to be implemented and raise any questions or concerns. Once the development starts, you might need to do any of the following tasks: Support developers: Provide guidance and answer questions about how things should look or work. Update: If there are technical limitations with implementing the design or new issues arise, get more user feedback and update the designs again. Review and desk check: When the development is completed and pushed to the test environment, review the work to make sure everything matches your design. The benefits If you follow this process, you’ll be able to develop products with: More efficiency: Time, effort, and cost will be reduced by discovering and testing different ideas early in the process and moving forward with the solution that works best for both customers and the business. Higher customer satisfaction: Continuous research helps you understand and respond to users’ needs so that you are more competitive in the market, which helps you increase customer satisfaction remarkably. Accelerated development: Providing high-fidelity designs and working closely with developers throughout the process prevents them from making changes and fixes that are avoidable (such as when mocks are not detailed enough or when they make the wrong assumptions), resulting in faster development. This high-level overview is a great starting point, but every organization and product has different needs. If you’d like to talk about how to improve your current product development process or how to establish a new one, contact us today.
An overview of RPA: Is it right for your business?
There’s been a lot of hype about Robotic Process Automation (RPA). Headlines tell us we can transform our business processes in as little as 12 weeks using RPA bots. Benefits are touted, velocity is promised, trends of growth are noted, and new jargon is coined: “Automation arbitrage, a term Gartner uses to describe the recalibration of human labor to drive business outcomes, is one of the biggest enablers in this coming decade.” – Gartner, The CIO’s Guide to RPA and Introduction to Hyperautomation. Hype can be fun, but it doesn’t answer the very practical question, “Can RPA help transform my business?” This article will answer it for you and help you make an informed decision about whether RPA is right for your organization. Read on to learn the best-fit processes, work through a decision flowchart to determine whether your process is suitable for automation, and gather helpful considerations to keep in mind as you’re getting started. There are also links to demos and further resources throughout. First, what is RPA? Robotic process automation uses computer software (bots) to emulate a human worker interacting with digital systems. RPA bots automate repetitive tasks by interacting with software applications, just as humans do while working. In short, companies use RPA software to perform repetitive tasks that would usually be done by workers sitting at their computers. Bots can be programmed to work just like us − logging into and switching between applications, interpreting information, making calculations, and copying and pasting data. Bots can also process data, trigger responses, and communicate with other systems to perform tasks at a high speed without error, which enables organizations to effectively automate tasks, streamline processes, and increase productivity. Often conflated with artificial intelligence (AI), RPA is non-intrusive and does not require system integration. It sits on top of your existing system to perform business processes, using the same interfaces that humans use. And, unlike scripts or macros, RPA will not break every time there is a minor software update. Essentially, bots can work in two modes: attended or unattended. This provides flexibility to better meet specific business needs. Attended bots are typically targeted toward front-office activities and are useful when the entire end-to-end process can’t be automated. These bots are programmed to work alongside humans to complete processes that can pass data between bots, applications, and human workers, or complete specific functions within a process. Unattended RPA bots execute tasks and interact with applications independent of human involvement. Unattended bots can be triggered by events or scheduled. They will run until a condition is met. Read more: Jumpstart your business processes: Using hyperautomation to achieve speed and scale >> Which business processes are the best fit for RPA? An important thing to understand about RPA is that it doesn’t add value to every area of the enterprise. Forrester Research, Inc. counsels caution when considering this “shiny new kid on the block.” Therefore, careful consideration, selection, planning, and governance are crucial to the success of an RPA implementation. First, consider whether your process falls within or is similar to this sample list of business functions that benefit most from RPA: Finance and accounting − orders, claims, vendor management, accounts payable, and collections. 
IT services − software deployment, server and app monitoring, routine maintenance and distribution, batch processing, password reset/unlock, backup and restoration. HR services − data entry, payroll, time and attendance management, benefits administration, compliance, and reporting. Supply chain − inventory management, demand and supply planning, work order management, and returns processing. Next, ask yourself the following questions about the process you hope to automate (see the decision flowchart below for the entire process): Is it rules-based, standardized, with clear processing instructions or templates? Is it highly manual, repetitive, and prone to human error? Do transactions flow at a high volume and/or frequency? Is it well-documented, stable, and mature? Are there standard, readable electronic input types? The above sections represent the critical first step to determine whether automation is right for your process. Completing the exercise of the decision flowchart will make it clear whether you should pursue business transformation via process improvement initiatives or RPA implementation. Benefits realized from RPA There is a reason automation is here to stay, and the sooner you implement RPA, the sooner you create a competitive edge for your business. RPA benefits include: Reduced costs − RPA can reduce processing costs by up to 80%. Improved economics, efficiency, and effectiveness through reduction of human error and the costs of duplicate effort, rework, and mistakes. Transformed and streamlined organization workflows. Increased compliance and consistency. Positive impact on operational metrics − reduced focus on non-value-add activities provides time for important strategic tasks and customer relationships. Improved customer service through agent access to readily available information and reduced manual efforts. Non-intrusive, seamless integration with existing enterprise systems, resulting in reduced implementation costs. Extremely scalable across business units and geographies; multiply bots and deploy more as you go. Improved processes − bots constantly report on their progress, so you can strategically improve processes by using operational and business predictability. How to ensure RPA implementation success Clear vision, comprehensive planning, and structured governance are critical factors in the success of any RPA implementation. Proposed changes must be well-defined by leadership, shared by IT and business, and communicated to the affected employees. Below is a list of success factors to keep in mind as your organization takes its first steps toward automation. Plan well − a common automation pitfall is lack of governance. RPA programs need centralized control and governance, including formalized methods and standards to ensure maximum benefits. Avoid working in silos − implementation efforts must be driven by collaboration between IT and the business and based on a clear vision from leadership. Start managing change early − your people strategy can’t be put off until deployment. The successful realization of benefits from RPA projects requires end-to-end organizational change management (OCM) that is adaptable to the size and complexity of the RPA endeavor. Communicate widely and frequently − throughout the implementation process, communication is key because bots will change how people do their jobs. In addition, some workers may fear job loss, so communication, transparency, and training can help them embrace this new frontier in business processes. 
Manage for, or eliminate, potential surprises − don’t forget to factor in the effects of third-party partnerships and applications. These are an uncontrollable factor in your business environment, so use care when including them in your automated process. Put process over tools − RPA is not only about rapidly developing bots. A robust governance structure, well-defined opportunity identification process, quality development, and reliable operations are more important than any particular tool. Stay objective − avoid implementing automation solely for the “wow factor.” Be sure you understand what you hope to achieve through automation and that you’ve considered the long-term costs involved. Manage expectations − bots are not the whole solution, and RPA is not a silver bullet; it should be viewed as only part of the automation strategy for the enterprise. You may need a broader strategy, such as system modernization, process transformation, and use of machine learning, to underpin a larger transformation effort. Keep in mind that RPA is not a “set-and-forget” process. New bots will need consistent oversight until they are fully trained. They’ll also require ongoing management, especially when there are changes in the system or environment. Summary Now that you have the facts, you can decide whether RPA is right for your organization. And if you need help, our team is experienced in leading Robotic Process Automation programs in both advisory and implementation capacities. Our solid partnerships with Microsoft and UiPath, a top RPA vendor according to the 2020 Gartner RPA Magic Quadrant, help us offer the most appropriate technologies available for your organization’s needs. RPA services that we offer include: Advisory/assessment Set up Center of Excellence (prioritization of applications) Construct team Evaluate tools Evaluate Book of Work (project work having funding associated with it) Process mapping and analysis Implementation Develop and upgrade bots Create run books for bots (procedures for handling tasks, contingencies, and troubleshooting) Perform monitoring and management (steady state) Contact Fusion Alliance to discover if RPA is right for you.
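The decision flowchart referenced above ultimately reduces to the screening questions from earlier in this article. As a rough, vendor-neutral illustration (not any RPA product’s API), here is a toy sketch in Java of how those questions translate into a go/no-go signal; the all-yes threshold mirrors the checklist, and everything else is invented for illustration.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Toy illustration of the RPA screening questions above; not any vendor's API.
public class RpaSuitabilityCheck {
    public static void main(String[] args) {
        Map<String, Boolean> answers = new LinkedHashMap<>();
        answers.put("Rules-based and standardized, with clear instructions or templates", true);
        answers.put("Highly manual, repetitive, and prone to human error", true);
        answers.put("High transaction volume and/or frequency", true);
        answers.put("Well-documented, stable, and mature", false);
        answers.put("Standard, readable electronic input types", true);

        answers.forEach((question, yes) ->
            System.out.printf("%-70s %s%n", question, yes ? "yes" : "no"));

        long yesCount = answers.values().stream().filter(Boolean::booleanValue).count();
        if (yesCount == answers.size()) {
            System.out.println("Candidate for RPA implementation.");
        } else {
            // Mirrors the flowchart's other branch: improve the process first
            System.out.println("Pursue process improvement first; revisit RPA once every answer is yes.");
        }
    }
}
```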
The future of retail: Conversational marketing and machine learning
This article was originally published at the Forbes Communication Council. Amazon has generally been considered the standard-bearer for product recommendations, and for good reason. The retail giant utilizes user data on past purchases, browsed-for items, and even what users have recommended to others to generate recommendations. Just think, recommendations likely popped up in the sidebar during your most recent Amazon binge. “People who viewed this product also viewed...” often appears as you scroll. A chatbot may even appear with ideas related to your shopping history. This is conversational marketing at work. Still, these advancements fall short of creating a truly personal experience that can predict and assist buying behavior by having a full view of who the person is — not just their recent search and purchase history. Even common segmentation methods fall short by making assumptions based on age and gender that fail to account for many outlying factors that can be easily discovered. The future of retail will be defined by immersive, conversational experiences that lead to better customer interactions and increased buyer loyalty for brands that stay ahead of the curve. While conversational marketing has taken many companies this far, using conventional conversational marketing techniques in conjunction with machine learning can be the answer retailers are looking for to create the experience of the future. Online retail: Blending conversational marketing with AI technologies While conversational marketing has become the trend in business-to-business (B2B) demand generation strategy, there exists a huge business-to-consumer (B2C) opportunity, as well. Conversational marketing practices utilize website chat features and chatbots to initiate in-the-moment interactions with customers and build context to quickly qualify them for the appropriate next step. According to David Cancel’s aptly titled book, Conversational Marketing, both baby boomers and millennials are likely to adopt the use of chatbots, with a majority in both groups citing instantaneous responses and quick answers to simple questions as potential benefits. Aside from the obvious advantage of getting answers to product questions, automated chatbots offer a number of opportunities to enhance shopping experiences when coupled with data. Machine learning and chatbots While many B2C companies are already leveraging chatbots to streamline the customer experience, there lies even greater opportunity with machine learning to truly learn from and predict consumer behavior. Today’s practical machine learning models enable rapid iteration over data and deliver quick, reliable results. Data collected from customer conversations about the products they research, buy, and use can tell a deeper story about the customer themselves over time. Instead of a static list of recommended products based on their last purchase, machine learning can help us understand the customer’s lifestyle and habits in such a way as to help the customer make the best purchase in the moment. As an example, imagine an on-the-go, seasoned business professional with a love of podcasts and streaming music. Our traveling audiophile is a regular adopter of new headphone technology and is on the hunt for a new pair. While segmented data and previous purchase history might be able to get us in the ballpark when it comes to their next tech purchase, they don’t tell the whole story. 
In fact, the reason for this purchase has nothing to do with a search for the latest technology; rather, past purchases have missed the mark for this customer’s need for multitasking and call connectivity. In this case, relying on past purchase history or even peer purchasing information won’t help. However, their experience with a chatbot powered by machine learning can give us helpful predictive data that informs the retailer of their need for a balance between audio quality and the ability to quickly and clearly connect to meetings during travel. A few quick questions allow the chatbot to suggest a new pair of headphones to fit their lifestyle, along with helpful content and reviews that match our customer’s pre-purchase research habits. Marrying predictive data to emerging technologies As advances in artificial intelligence (AI) continue to blur the line between human and bot, and retail brands continue to experiment with augmented reality (AR) to replicate brick-and-mortar shopping experiences, it’s vital that data plays a role in the next phase of online shopping. Not only should brands be placing an emphasis on the aesthetic experience that can be delivered through apps infused with AR, but they should also make room for predictive machine learning data to make the buying process even easier for the consumer, making them more likely to return in the future. In fact, for any brand wishing to be at the forefront of the next wave of retail evolution, I believe it’s vital that a data governance framework be in place and actively funnel information to teams developing emerging technology. The days of keeping customer data siloed away from our product teams need to come to an end in order to fully realize the marketplace potential. The future of retail is filled with possibilities that can completely reshape the way we understand consumer behavior and connect with the consumer to meet their needs in real time. Taking tangible steps to listen to our customers, learn from them, and act to predict their needs, while delivering a stellar shopping experience along the way, is more than a possibility — it’s a reality. At Fusion Alliance, we find our place at the intersection of advanced analytics, experience design, and technology, leveraging machine learning to gain customer insights that inform our strategies. Learn more about our approach to machine learning solutions >>
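To ground the audiophile example, here is a deliberately simplified sketch in Java of how preferences inferred from a chatbot conversation might re-rank a product catalog. The attributes, weights, and products are invented for illustration; a production system would learn them from conversation and purchase data with a trained model rather than hard-code them.

```java
import java.util.Comparator;
import java.util.List;
import java.util.Map;

// Toy re-ranking example: chatbot-derived preference weights score a small catalog.
public class HeadphoneRecommender {
    record Product(String name, Map<String, Double> attributes) {}

    public static void main(String[] args) {
        // Weights inferred from a short chatbot exchange: this customer values
        // call connectivity and multitasking over raw audio quality.
        Map<String, Double> preferences = Map.of(
            "audioQuality", 0.3,
            "callConnectivity", 0.5,
            "multiDeviceSupport", 0.2
        );

        List<Product> catalog = List.of(
            new Product("StudioPro X", Map.of(
                "audioQuality", 0.9, "callConnectivity", 0.4, "multiDeviceSupport", 0.3)),
            new Product("CommuterBuds", Map.of(
                "audioQuality", 0.6, "callConnectivity", 0.9, "multiDeviceSupport", 0.8))
        );

        catalog.stream()
            .sorted(Comparator.comparingDouble((Product p) -> score(p, preferences)).reversed())
            .forEach(p -> System.out.printf("%s: %.2f%n", p.name(), score(p, preferences)));
    }

    // Weighted sum of how well each product attribute matches the preferences
    static double score(Product p, Map<String, Double> prefs) {
        return prefs.entrySet().stream()
            .mapToDouble(e -> e.getValue() * p.attributes().getOrDefault(e.getKey(), 0.0))
            .sum();
    }
}
```

Even this toy version captures the article’s point: the conversation-derived weights, not the purchase history alone, are what surface the right product for this customer.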
A winning data management and analytics strategy
Digital natives like Uber and Lyft have changed the face of the taxi industry and the customer experience. Many are huge fans of these companies, and it’s no wonder. You don’t have to flag anyone down or have an awkward street scuffle to get a ride. Uber has skipped the web and gone straight to the mobile device as its target platform for order management and fulfillment. Every customer transaction goes into Uber’s database: name, address, credit card info, cell number, pickup location, drop-off location, where you travel and when — the list goes on. But it’s not just the ride that is valuable to Uber. Their database is where the real value is. And guess what? Uber was recently valued at $49 billion. Every company has data. And every company needs a data strategy to take advantage of its value. Some steps to create a winning strategy are: 1. Make data management and analytics a priority Increasingly, business and personal transactions and interactions are going through digital channels. As they move to digital channels, they leave behind a lot of data that was not available to companies before. New sources of data exist everywhere — social media, geolocation data, etc. There’s just a lot more digital content available now to help a business understand its performance, relationships, and reputation. But you need to know what to do with it. For example, as Uber’s database grows, it becomes more valuable. They can look at their customers’ digital footprint through analytics and, over time, they can see distinct types of users emerge from their travel and interaction patterns. They can then use these analytics to expand and refine their service offering to better serve the needs of users and travelers with similar digital footprints. 2. Overcome IT challenges There are many new choices of data technologies. You need to figure out how to incorporate them into your company’s existing technology stack. But even before that, you need to understand how to manage data as an asset and consider: How data is governed Who owns the data How to manage data quality and security How to handle demand management as new data requests come in and new data sources are identified In addition, you need to understand how it will be integrated into the infrastructure and environment. That said, it’s hard to find the people right now who understand all of these technologies. There is a shortage of data scientists and Hadoop engineers, for example. Having access to the resources with the skills to implement and manage these new technologies can be one of your biggest constraints and barriers. Legacy systems vs. open source There are also challenges associated with the whole technology space. Legacy vendors, such as Oracle, Teradata, and Microsoft, want to maintain their hold. They’re all fighting to remain relevant in a technology market space where open source is creating more compelling and cost-effective solutions for businesses. Microsoft, which has a huge research component, has had difficulty embracing the open source movement in the past. Today they are in full support of open source projects in Azure and Visual Studio and release many of their own code bases, such as .NET. Open source is actually much more valuable because it’s run by people who are constantly working on security and performance issues. These folks will immediately address your issues for one overriding reason: they are passionate about code — Wikipedia all over again! 3. 
Prepare for organizational change We’re not just dealing with our own transactional data anymore. We’re dealing with data from our industry or sector, as well as external data, such as weather. Though weather might not seem like it has anything to do with your business, weather data can provide insight that enables you to positively impact the business. Many other datasets are also available, some for a fee, which organizations have discovered they must pay attention to, in addition to their own operational data. The technologies of today’s data management agenda are new and emerging and are not technologies for which IT traditionally has the skills. There’s a big divide between IT capability and what the business demands for integrating and managing data. As a result, roles like data scientist and data analyst, with the kinds of necessary skills, are not yet common within organizations, making organizational and change management a requirement. 4. Embrace the role of a CDO Until recently, we’ve pretended as if the people who are responsible for the technology (the wires, pliers, software, and ERP systems) actually care about the data, but they don’t. A CIO is not the best person to manage the data. There’s a new paradigm out there: the chief data officer, or CDO. Data is such a critical corporate asset that it needs to be managed strategically and at the executive level outside of IT. Technology is an enabler, but data is an asset. Currently, many organizations account for them in the exact opposite way. Many organizations are now appointing a CDO, reporting to the COO or CEO, whose role is to oversee and manage the quality, integrity, and use of the organization’s data assets, just as the CFO governs the organization’s financial data. Start implementing a winning data strategy today The elements covered here will get your business off to a strategic start toward more effective management of your data and analytics. If you want to be successful, remain open to new ideas, get help from outside, and embrace new paradigms for how your business should interact with data as it continues to evolve. Take advantage of anything that can accelerate turning the data already sitting in your systems into insights you can act on in the marketplace. Having a strategic partner who brings the required expertise and ability to implement proven methodologies will enable your company to create successful data capabilities, and you’ll be able to groom and train internal resources at the same time. That’s what’s going to enable you to beat your competition.
6 steps for planning your big data strategy
Internet users produce an estimated 2.5 quintillion bytes of data each day. Yes, that’s quintillion — as in a one followed by 18 zeroes. That’s a mind-boggling amount of data. Yet, every day, that information is mined, analyzed, and leveraged into usable insights that businesses then use to streamline operations, assess risks, track trends, reach a specific target audience, and so much more. Big data, the term we use to describe this vast amount of information, is a goldmine for industries seeking to increase revenue and improve operations. But without a solid strategy for how to use that data, you could scour the internet until the end of time and still not see any gains. Before you dive into the big datasphere, it’s best to familiarize yourself with what a big data strategy looks like. Then, you can take measured steps to ensure your vision is properly focused and ready to deliver the value you need. What is a big data strategy? A big data strategy is exactly what it sounds like: a roadmap for gathering, analyzing, and using relevant industry data. Regardless of business vertical, an ideal big data strategy will be: Targeted. You can’t hit a moving target, let alone one that’s too nebulous to define. Drill down to the details until stakeholders are aligned on the business objectives they want to reach through your big data strategy. Actionable. Data can be insightful without necessarily being actionable. If your big data strategy doesn’t serve up information usable by the broader team while paving the way for next steps, it likely won’t be beneficial in the long run. Measurable. As with any other business plan, a big data strategy needs to be measurable to deliver lasting success. By measuring your incremental progress, you can refine your strategy along the way to ensure you’re gathering what you need and assessing it in a way that serves your goals. What’s the best way to approach a big data strategy? Now that we’ve covered the basics of what a successful big data strategy entails, let’s turn to how your organization might put one into practice. As we’ve worked with clients across industries, we’ve seen the following six steps deliver wins. Your big data strategy will likely require unique details, but this action plan gives you a starting point. 1. Gather a multi-disciplinary team Big data is not solely an IT project; it’s a business initiative. The team should have more representatives from business departments than from the corporate technology group. Members typically include knowledgeable staff or managers from finance, business development, operations, manufacturing, distribution, marketing, and IT. The team members should be familiar with current reports from operational and business intelligence systems. A common thread? Each team member brings ideas about performance indicators, trend analysis, and data elements that would be helpful to their work but which they don’t already access. More importantly, they know why having that information readily available would add value — not only for their business units, but for the organization as a whole. 2. Define the problem and the objectives What problem should be analyzed? What do you hope to achieve through your strategy? Take three problems you’d like to have solved and formulate them into questions. Limit yourself to three, to start. There will always be more questions to answer. Don’t try to tackle them all at once. Write those questions as the subject lines of three emails. 
Send them to all members of the multidisciplinary team. The replies will guide your efforts in narrowing (or expanding) the initial scope of study. Here are a few questions to get the ball rolling: What do you want to know (about your audience, your processes, your revenue streams, etc.)? Which factors are most important for increasing margin on a given service or product? How much does social media reflect recent activity in your business? Which outcomes do you want to predict? Developing a 360-degree view of all customers in an enterprise may be too ambitious for an initial project. But finding the characteristics of commercial customers who have bought products from multiple lines of business in five key geographic markets might be a more manageable scope right out of the gate. With this approach, iterations in development provide expansion to all lines of business or to all markets in cadence with a company’s business pace. 3. Identify internal data sources Before getting into the technical weeds, you need to know what data exists internally from a functional viewpoint. Gap analysis will uncover incomplete data, and profiling will expose data quality issues. Your first step is just to identify what usable data you have. If customers for one line of business are housed in an aging CRM, and customers for a newer line of business are found in a modern system, a cross-selling opportunity analysis will point out the need to integrate those data sources. Do you have an inventory of data sources written in business language? In forming a strategy, a team will want to have references, such as vendor contracts, customer list, prospect list, vehicle inventory, AR/AP/GL, locations, and other terms that describe the purpose or system from which the data is derived. The list can be expanded for technologists later. Learn how to develop data as an asset >> 4. Find relevant external data sources If you don’t have enough data internally to answer your questions, external data sources can augment what you do have. Public data sites like Data.gov, the U.S. Census Bureau, and the Bureau of Labor Statistics’ Consumer Price Index have a vast amount of information available to anyone who can operate a search function. Data.gov alone has over 100,000 datasets, some containing millions of rows covering years and decades. Social media is another invaluable source of data. Regardless of industry, Twitter, Facebook, and Pinterest posts may have a greater impact on your operation than you realize. Be sure that a couple of members of the team pursue data from social media sources to include in the initial study. 5. Develop an organizational system One of the most important elements of a big data strategy is organizing the data you collect. Whether it’s analytics dashboards or full-blown data fabric systems, you’ll need a way to organize data in order to analyze it. Decide how and where you want the data to live, how it can be accessed, and who will have access to it. Remember that the more you democratize data, the more your team grows comfortable with reading and handling this information, and the more insight you can glean. However, this also means you’ll need a strong system of management to ensure the data is secure. 6. Get experienced guidance Engaging an experienced team that has led others through data strategy and implementation can help you jump-start your strategy. An external resource skilled in big data management can provide your company with a smooth progression through the many tasks at hand. 
Your guide should have extensive knowledge of business data elements, or BDEs, which are key to creating understandable and cross-company analytical outputs, including reports, charts, graphs, indicators, and other visualizations. Seek guidance especially if your organization doesn’t have a data glossary, network administration, or knowledge of new technologies, as implementing these can be highly technical and time-consuming. Planning your big data strategy Planning a big data strategy will require you to rethink the way you manage, operate, and analyze your business. But with the right guidance and tools you can develop an effective strategy that positions your company for growth and success. Need a guide on the path to creating your big data strategy? We’re here to help. Reach out to an expert to learn more about how you can leverage big data for your business. Discover our strategic data management services >>
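As a quick illustration of step 4, many public catalogs are searchable programmatically as well as through a browser. Here is a rough Java sketch querying Data.gov’s catalog, which exposes a CKAN-style search API; treat the endpoint and response shape as assumptions to verify against the current documentation, since public APIs evolve.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Rough sketch: searching Data.gov's public catalog for datasets by keyword.
// The query term is a placeholder; searches require no API key at time of writing.
public class DataGovSearch {
    public static void main(String[] args) throws Exception {
        String query = "consumer+price+index";
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("https://catalog.data.gov/api/3/action/package_search?q=" + query))
            .header("Accept", "application/json")
            .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body()); // JSON containing matching dataset metadata
    }
}
```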
Why Microsoft Dynamics 365 is a gamechanger
If you have a legacy ERP or you’re considering an ERP, you may be considering Microsoft Dynamics 365. We believe it is truly a gamechanger, so we put together this evaluation of the platform so you’re well prepared to decide if it’s right for you. A little background A new day dawned in the ERP market in late 2016, when Microsoft released the new Microsoft Dynamics 365, a comprehensive software suite that fuses CRM and ERP cloud services, each focused on a specific business process. This integrated, cloud-based CRM and ERP solution offers intelligent Software as a Service (SaaS) applications that seamlessly integrate with Microsoft Office 365 and other Microsoft cloud-based technologies such as Flow, Power BI, Power Apps, and Cortana Machine Learning. Dynamics 365 incorporates leading-edge, module-based business processes in the Field Service, Sales, Project Service Automation, Customer Service, Marketing, Financials, and Operations suites. The Operations suite incorporates advanced ERP functionality such as manufacturing, HR, project accounting, supply chain management, procurement and sourcing, retail operations, point of sale, and e-commerce. Now you have a technical explanation of functionality, but why should you dig in and understand more? One huge reason: Dynamics 365 is a gamechanger in the ERP world. The difference between Dynamics AX and Dynamics 365 Dynamics 365 represents an entirely new model for delivering a combined ERP and CRM solution in a way that aligns with today’s world of digital transformation. More importantly, it helps your organization more rapidly and cost-effectively fit into this new world order, competing at an accelerated level. Takeaways include understanding: The tremendous cost savings that can be realized The competitive value Dynamics 365 will bring to your organization (and how) The efficiency gained through seamless integration with apps How the benefits of cloud services can impact your business The old way of doing things Before Dynamics 365, the choice was to implement Dynamics AX and Dynamics CRM on-premises. Let’s look at an example of how this played out for a fictitious company with $1 billion to $2 billion in annual revenue. From a project perspective, it took an average business three to six months to get the ERP and CRM installed and ready to set up the company for configuration. The typical cost was: $150,000 to $250,000 on hardware and the virtualization platform (most often VMware vCloud or Hyper-V) $250,000 to $500,000 in server OS, database servers, application servers, SharePoint server, and ERP licensing $200,000 to $400,000 in professional consulting services Just to get to the setup stage, the company had already invested a significant amount of time and money, yet not one ounce of business value had been delivered − because the work wasn’t done. The remainder of the project took around 1 to 1.5 years to configure, customize, train, and migrate the data. At the end of the project, the business walked away with complete, end-to-end management of processes through the ERP, from financials to supply and inventory, to sales and delivery of product, which is a win. But those weren’t the only costs. The consulting professional service fees tacked on an additional $3 million to $10 million, depending on the modules, level of customization, system integrations, and reporting. 
All told, it took about 1.5 to 2 years and $3.5 million to $10 million in costs before rolling out the system into production and realizing the business value/ROI. Though the software delivered what was needed, it’s easy to see why that model no longer works in this age of transformation. Digital technology has enabled customers to demand and receive what they want when they want it − instant gratification. At a time when customers’ expectations are high, organizations must find ways to cut costs and do more with less in order to compete. Changing times call for a new model of delivering ERP solutions. Enter Dynamics 365. How Dynamics 365 works: a solution for today’s market The beauty of the new Dynamics 365 is in the accelerated pace you gain in getting a return on your investment. The installation and initial server configuration can be completed in one week because everything is hosted in the cloud as SaaS. No complicated hardware, server, and database installation or configuration nightmares to contend with. The speed to get an environment ready and the component-based agile implementation capabilities eliminate a tremendous amount of cost versus the traditional on-premises installation. Dynamics 365 is no longer just an ERP. It’s a cloud-based business platform that enables companies to quickly revitalize their business processes and enable a fast path to digital transformation. Dynamics 365 is Microsoft’s new approach to end-to-end intelligent business applications in the cloud. The benefits of Dynamics 365 Let’s break down what you get from Dynamics. The business applications enable companies to do the following. Start with what you need − Dynamics 365 apps are designed to be easily and independently deployed. That means you can start small with the right fit for your role, industry, and business. You pay only for what you need. Dynamics 365 apps work together seamlessly and fit with your existing systems, so as your business demands, you can grow into additional capabilities with ease and run your entire business in the trusted Microsoft cloud. Productivity where you need it – Deep integration between Dynamics 365 and Office 365 connects the structured workflow of business applications and processes with the unstructured work of collaboration and productivity. So your employees are empowered with productivity tools surfaced in the context of their business processes. For example, a salesperson receives an email and can respond directly in Office with a quote created based on information from both the Finance and Sales apps, stored back to the right app, with the right pricing, discounting, etc. All this is done without the user leaving Outlook. Another example: a sales rep works on an opportunity in Dynamics 365 Sales with embedded access to multiple productivity tools to more effectively progress the opportunity. The rep can collaborate with team members by sharing real-time updates in Yammer and across groups, compiling team notes and input on the opportunity in OneNote, and seamlessly generating a quote in Word – all without leaving the context of the opportunity. Intelligence built-in – With Dynamics 365, Microsoft is the only provider of business applications that infuses big data, advanced analytics, and IoT into processes out of the box to proactively guide employees and customers to optimal outcomes with predictive insights, prescriptive advice, and actionable next steps. Data and insights are transformed into action for intelligence where it is needed. 
For example, Cortana Intelligence will enable cross-sell recommendations that help sales reps predict which products and services a customer will need. Access to IoT data inside Dynamics 365 for Field Service will enable preemptive action from field service agents by connecting asset monitoring and anomaly detection, so they can act before failures occur and avoid costly customer service issues.
Ready for growth – Dynamics 365 enables companies to adapt and innovate in real time with nimble, adaptable applications, so you can compose, modify, and extend processes. Business users are empowered to change and adapt without IT. And you can reimagine your business model with a common data model and a flexible, extensible business application platform.
Why Dynamics 365 is a game-changer
The phrase “game-changer” is thrown around loosely, but Dynamics 365 is truly just that. It reduces startup and licensing costs and accelerates the agile implementation of business processes, enabling only the modules you need, when you need them. Plus, the possibilities gained from leveraging the extensibility of the Microsoft Azure cloud platform will drive your business forward and create value for your customers. In short, you will no longer waste time focusing on IT tasks and can instead focus on positioning your IT investment to drive value faster than ever before.
Looking for a next step?
As a Microsoft Gold Partner, we specialize in Dynamics 365, Office 365, Power Apps (including Power BI, Power Automate & Power Virtual Agents), and Azure cloud services, covering everything from licensing and migration to ongoing support and infrastructure. Learn more about our partnership and Microsoft services here.
How to calculate the cost of a big data strategy
The future looks rosy for companies that take advantage of what strategic data management can do. But the specter of needing a team of people to handle on-premises hardware, and the cost implications of doing so, continues to make organizations hesitant to move forward with a new data strategy. Here are a handful of factors to consider when weighing the costs versus benefits of implementing a big data strategy in your organization.
1. Compare the dollars and cents
In 2012, I conducted a study that compared the cost of managing data with traditional data warehousing assets, such as Oracle, to the cost of managing that same data with an open-source software framework, such as Hadoop. At the end of the day, even including a 60% discount off list price for the Oracle hardware and software licenses, the cost to manage 1 terabyte in a 16-terabyte configuration was $26,000 per terabyte with traditional assets, compared to $400 per terabyte with an open-source framework (see the sketch at the end of this article).
2. Analyze the total cost of ownership
The reason there wasn’t a mass exodus from Oracle to Hadoop in 2012 is that you have to consider the total cost of ownership. You have to ask, “Does my organization have the skills to manage this new technology environment? Is my existing Business Objects universe investment compatible with the back end?” In 2012, the answer was no. Today, you can connect your existing Business Objects universe investment to Hadoop on the back end. Then, take all that data out of Oracle, expose it through Hive tables where it can be accessed, and enable the environment to perform even faster than it can in Oracle, for pennies on the dollar. Pennies! Why wouldn’t you do that?
3. Evaluate the competitive advantage
It goes something like this: “Well, if my competitor is running their data warehouse for $4 million a year on a legacy technology stack, and I can ‘lift and shift’ my data warehouse to a technology stack that I can run for $40,000 a year, who’s going to gain a competitive advantage?”
4. Assess the value of a 360-degree view of your customer
In the TV series “How to Get Away with Murder,” detectives perform a forensic analysis of a suspect’s cell phone data that was backed up to his computer; the rest of the data is provided by the telecom provider. Because of the GPS service on the suspect’s phone, the detectives were able to identify his entire route from one state to another, how much time he spent in motion, how much time he spent when he stopped, when he started again, and how many minutes his phone was in a particular location. They were able to create a geospatial plot of his path, all using the data stream from his mobile phone as he drove his car with his phone on his person. This brings us to another important point about data today: we’re living in a world of mashups. There’s an opportunity to subscribe to a Twitter feed and mash it up with an email address linkage in a way that identifies a person’s behavior and thought processes. Everything that lives in someone’s Twitter space or Facebook posts can be analyzed. Mashing up these many sources of data into a mega-analytic platform has become easy to accomplish − but not if you don’t have a strategy for how you’re going to manage the data. Sam Walton’s objective with his fledgling Walmart stores was to always know what the customer wanted to buy and always have it on the shelves when he or she walked into the store.
Back in the 1980s, Walmart used Teradata technology to build a database that collected all of the point-of-sale data, which was then used to calculate how many units to ship to each store so they wouldn’t have to carry a surplus of inventory. The rest is history. The database actually became much more valuable to Walmart than the inventory-carrying-cost problem it solved, and Walmart is now a half-trillion-dollar-a-year global company.
5. Gauge the payoff of higher-end analytics
Amazon is another huge data success story. As you know, they started as an online bookseller and didn’t make much money selling books online. But what they were able to do is get consumers to go to their portal, interact, and leave data behind. They were very successful in leveraging that data, and from it, they have grown into a company with over $100 billion in sales. And now, of course, Amazon sells everything. Amazon uses the highest end of analytics: predictive analytics. In fact, they recently filed for a patent on an analytic model that can predict what you’re going to buy before you buy it. Predictive analytics tells them there’s a pretty good chance that you’re going to purchase a product in the next 24-48 hours. They’re so confident in the accuracy of their algorithm that they would ship you that product before you even buy it. Let’s say something from Amazon shows up on your doorstep that you didn’t order, but it’s something that you wanted. Then you’ll pay for it. This isn’t yet a production feature of amazon.com, but keep your eye on the bouncing ball!
The future of big data strategies and strategic data management
The future belongs to companies whose data game is completely integrated into the foundation of how they do business in the marketplace. And because companies like Amazon know so much more, their revenue is so diverse, and their ability to manage data is so significant, they are now even in the data hosting and data enrichment services business. They are selling their data and hosting apps in an infrastructure that exists because of their desire to manage data and their ability to do it effectively. If you look at where venture capital partners are investing their money today, you’ll see that it’s in companies that are busy creating that layer of integration between the front end and the back end − because they have determined that the benefits of having a big data strategy greatly outweigh the costs.
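To make section 1’s dollars-and-cents comparison concrete, here is a minimal sketch using the per-terabyte figures cited there; the 16-terabyte size matches the study’s configuration, and the rest is straightforward arithmetic.

```python
# Compare the cost of managing a 16 TB warehouse at the per-TB figures
# cited in section 1 (traditional assets vs. an open-source framework).
TERABYTES = 16
COST_PER_TB_TRADITIONAL = 26_000  # USD/TB, discounted traditional stack
COST_PER_TB_OPEN_SOURCE = 400     # USD/TB, open-source framework

traditional = TERABYTES * COST_PER_TB_TRADITIONAL
open_source = TERABYTES * COST_PER_TB_OPEN_SOURCE

print(f"Traditional: ${traditional:,}")                  # $416,000
print(f"Open source: ${open_source:,}")                  # $6,400
print(f"Savings factor: {traditional // open_source}x")  # 65x
```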
The case for agile and self-service BI
Recently, our team was on a call with a client who was trying to consolidate dozens of transactional systems into a single model to support a more effective reporting paradigm. The envisioned solution focused on self-service, visual analytics, while also supporting more traditional reporting. This client’s challenges were similar to what many other businesses face today. They wanted:
Quicker time to insight
Empowered end users
Lessened dependency on IT
Reduced reconciliation of reports
Sound familiar? The client wasn’t questioning whether there was value in the project ahead. Their questions were focused on the best approach: do we pursue a big-bang approach or something more agile in nature? Upon further discussion and reflection, the objectives of the program seemed to be a perfect case for agile. Let’s talk about why.
Iterative selling of value
While the client knew the value of the project, we discussed how, in reality, data projects can die on the vine when the value isn’t apparent to the business funding the initiative or to the IT executives who need to demonstrate their operational ROI. As such, the ability to demonstrate value early and often becomes critical to building and keeping the momentum necessary to drive projects and programs across the finish line. Project sponsors need to constantly sell the value up to their management and across to the ultimate customer. Iterative wins become selling points that allow them to do so.
Know your team’s delivery capability
To truly understand what can be delivered (and by when), you must accurately assess how much work is in front of you and how quickly your team can deliver with quality. This project was new, and so was the client’s team. For them, the most logical approach was to start doing the work to learn more about the work itself as well as the team. After a few iterations, the answers to the following questions become clearer:
Parametric estimating – How do I estimate different complexities of types of work or data sources? How do I define the “buckets” of work and associate an estimate with each? What values do I assign to each of these buckets?
Velocity – How quickly can my team deliver with each iteration? How much work can they reliably design, build, and test? (See the sketch at the end of this article.)
Throttling – What factors can I adjust to predictably affect velocity without compromising quality or adversely affecting communication?
Continuous improvement – Fail fast, learn fast, adapt. Do I understand what factors are impeding progress that I can influence? What are we learning about how we accomplish the work so we can improve going forward? How do we get better at estimating?
Team optimization – Do I have the right players on the team? Are they in the right roles? How does the team need to evolve as the work evolves?
Foster trust – ensure adoption
Anyone who works with data, whether in business or IT, has go-to sources they rely on. Getting an individual to embrace a new source for all of their information and reporting needs requires that the new source be intuitive to use, performant, and, above all, trustworthy. As with any new solution, there will be skepticism within the user community and, whether conscious or not, an unspoken desire to find fault in the new solution, thereby justifying staying with the status quo. Data quality and reliability can be the biggest factors that adversely impact adoption of a new data solution.
By taking an agile, iterative development approach, you expose the new solution to a small group initially, work through any issues, then incrementally build and expose the solution to larger and larger groups. With each iteration, you build trust and buy-in to steadily drive adoption.
Generate excitement
Following an iterative, expanding rollout fosters genuine excitement about the new solution. As use expands, adoption becomes more a result of contagious enthusiasm than of a forced, orchestrated, planned activity. Tableau’s mantra for many years has been “land and expand” − don’t try to deploy a solution all at one time. Once people see a solution and get excited about it, word will spread, and adoption will be organic.
Eliminate the unnecessary
While there are many legitimate use cases for staging all “raw” data in a data lake, concentrating on the right data is the appropriate focus for self-service BI. The right data is important for ensuring the performance of the semantic model, and it’s important for presenting the business user with a model that remains uncluttered by unnecessary data. Agile’s focus on a prioritized set of user stories will, by definition, de-prioritize and ultimately eliminate the need to incorporate low-priority or unnecessary data. The result is the elimination of wasted migration time and effort, a reduced need to create and maintain various model perspectives, and ultimately quicker time to insight and value.
Adjust to changing requirements and priorities
Finally, it’s important to understand that data projects and programs focused on enabling enhanced or completely changed reporting paradigms take time to implement, often months. Over that time period, priorities will likely change. An agile approach allows you to reprioritize with each iteration, giving you the opportunity to “adjust fire” and ensure you’re still working on the most important needs of the end users. Ready to roll out a successful self-service business intelligence program and not sure where to start? If you’re ready to take the next step, we’re here to help.
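As a concrete companion to the “velocity” questions above, here is a minimal sketch of iteration-based forecasting; the story-point unit and all numbers are hypothetical.

```python
# Forecast how many iterations remain, based on observed velocity.
# Story points and backlog size are hypothetical.
completed_points = [18, 22, 20, 24]   # points delivered in past iterations
backlog_points = 160                  # estimated remaining work

velocity = sum(completed_points) / len(completed_points)  # 21.0 points/iter
iterations_left = -(-backlog_points // int(velocity))     # ceiling division

print(f"Average velocity: {velocity:.1f} points per iteration")
print(f"Forecast: about {iterations_left} iterations to clear the backlog")
```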
Align digital analytics with your business strategy
With increasing customer demands and competition a click away, access to data-driven responses in real time has become a necessity for business users and marketers. Today, the market is full of digital analytics tools for measuring your customer experience across web and mobile applications, customer relationship management (CRM) systems, and point of sale (POS). When used correctly, these digital analytics tools can provide businesses with a wealth of insights into the performance of their digital platforms. To best leverage digital analytics, you will first need to set clear business objectives and define how your organization intends to measure success on your digital platforms. An in-depth look into your current measurement strategy, if one exists, including metrics and key performance indicators (KPIs), will reveal whether digital analytics are providing the data and insights necessary to ensure business and customer needs are met.
Identifying metrics
Organizations often make the mistake of using out-of-the-box metrics like page views, sessions, bounce rates, and session duration as KPIs. These basic metrics are not representative of actual business objectives and can prove useless without the right context. For example, a marketer who wanted to understand the value of a landing page would want to look at the number of leads the page generated or the long-term business impact of the customers who came to the site through it. Too often, though, reports focus on the number of people who saw the page or the bounce rate for the content. While this is helpful information, it doesn’t mean anything if you can’t tie the analysis to ultimate business success. Regardless of industry, website visits and page views do not increase bottom lines, nor should they be used as KPIs. If these are the types of metrics you’re seeing in reports or using in your analysis, instead of KPIs like leads, transactions, revenue, or conversions, then it may be time to re-examine your digital measurement strategy.
Developing a measurement strategy
Successfully integrating digital analytics into business processes requires a clear measurement strategy. A measurement strategy outlines the business objectives, what should be tracked on the website or mobile app to inform those objectives, the types of reporting that will be available, and to whom it will be exposed once the implementation is complete. Depending on current processes, analytics tools, technology, and available resources, the process of uncovering this information can take several months, but it is a vital step that should not be overlooked or rushed, as the end result is a digital measurement model that provides the framework to align digital analytics with business strategy.
The digital measurement model
A digital measurement model is a high-level, visual summary that links your core business objectives, such as increasing brand awareness, acquiring customers, or increasing sales, to the digital strategies used to achieve those objectives and their requisite goals. From there, specific KPIs and targets are identified for each digital strategy, helping business and marketing stakeholders understand whether their efforts are trending in the right direction. These elements should be captured in a matrix that can be used to inform the tracking strategy, guide reporting development, and ultimately gauge the health of your digital practice.
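As an illustration, the matrix can be as simple as a nested mapping from objectives to strategies, KPIs, and targets. The objectives and numbers below are hypothetical; the point is the structure, not the values.

```python
# A digital measurement model expressed as a data structure.
# Objectives, strategies, KPIs, and targets are hypothetical examples.
measurement_model = {
    "increase online sales": {
        "strategy": "optimize the checkout funnel",
        "kpis": {"conversion rate": {"current": 0.028, "target": 0.035}},
    },
    "grow brand awareness": {
        "strategy": "expand organic content reach",
        "kpis": {"new organic users": {"current": 38_200, "target": 50_000}},
    },
}

# Summarize progress toward each target for stakeholder review.
for objective, details in measurement_model.items():
    print(f"{objective} -> {details['strategy']}")
    for kpi, v in details["kpis"].items():
        pct = v["current"] / v["target"]
        print(f"  {kpi}: {v['current']} of {v['target']} ({pct:.0%})")
```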
Benefits of a measurement strategy
Creating a measurement strategy that aligns your business goals with the activities of the digital teams can have a significant impact on how the business operates. With clearly defined objectives and KPIs for measuring digital outcomes, digital teams can focus their efforts on producing measurable value instead of opting for a shotgun approach that hopes some portion of their efforts will drive outcomes. A well-defined digital measurement strategy encourages an environment of accountability. With KPIs to measure the gap between real-time digital outcomes and targets, executives gain greater visibility into the progress (or lack thereof) being made toward business objectives. It also creates a baseline of expectations, helping digital team members better prioritize work to produce measurable value. Most importantly, developing a digital measurement strategy gets people talking. Shaping strategy to reflect business objectives encourages collaboration among practitioners and leaders across the board, from marketing analysts to the CMO. Gaining alignment on what matters most helps an organization instill confidence in its teams and helps team members better understand how their day-to-day work contributes to the overall mission of the company.
Final thoughts
All too often, digital analytics are completely overlooked within marketing teams. This could be because the team lacks expertise around robust measurement implementations, or because analytics has been under-prioritized in favor of more tactical activities. Whichever the case, overcoming these hurdles to generate actionable business insights from your digital platforms is vital to the health of your digital practice and the needs of your customers. To successfully leverage digital analytics, organizations need to take a deep look into their current measurement strategy and reframe as needed to align their implementations with their established business strategy. Ultimately, a clearly defined digital measurement strategy paves the way for lasting, meaningful insights from your digital platforms and provides a system of accountability for team members and leadership to unite around.
How to build your digital analytics capabilities
When digital analytics do not produce useful outcomes to inform business decisions, digital teams often point to reporting and analysis as the culprit. But an in-depth investigation of the measurement strategy and digital analytics implementation often reveals a much different truth: digital marketing teams often don’t fully understand the capabilities of the tools at their disposal. This common issue is born out of inconsistent technical implementations, a lack of analytics expertise within the team, or a general misunderstanding of the types of metrics and data the business should collect. As long as digital teams maintain that reporting and analysis are at the heart of the issue, organizations will not be able to leverage the full feature set available in analytics tools like Google Analytics or Adobe Analytics. This leaves digital leaders questioning whether they should invest in more expensive or specialized tools to get the “right” data that will create new and incremental value. The often surprising reality is that an updated implementation with more focused tracking would suffice to provide digital teams with the valuable data they seek across all digital platforms (e.g., mobile app, CRM, and point of sale). With the recent addition of tools like Google Data Studio (a lightweight BI dashboard tool) and Google Optimize (an optimization and experimentation tool for the free Google Analytics suite), the vast majority of businesses don’t need a paid analytics solution. You can invest the money usually spent on expensive data management tools into analysts or other digital marketing efforts. In most cases, the free versions of tools like Google Analytics and Google Tag Manager are more than sufficient for the needs of an organization, but teams don’t have a true understanding of the tools’ capabilities or don’t implement their tracking in a way that works within the limitations of the free toolsets.
9 questions that can clarify your digital analytics capabilities
If you wonder whether your digital analytics toolset is up to par, start by asking some basic questions. The answers will divulge the true extent of your digital analytics capabilities and identify areas for improvement from both technical and expert standpoints.
1. Are we combining data that we already have about our customers with their on-site activities?
Many digital teams who use only the out-of-the-box versions of tools like Google Analytics are unaware that the tools come with powerful custom features. Custom dimensions, for example, provide valuable context to the information being collected. You might have data, including gender, zip code, customer segment, persona type, etc., about a specific user based on their digital profile. Populate these values into your analytics code as custom dimensions, alongside everything else you track, to create meaningful user segments that provide insights rather than just metrics. (A minimal sketch follows question 2.)
2. How do specific sections of the website compare to others?
Many teams stop their measurement at the page level. This is a natural inclination, considering every interaction is recorded against the page on which it occurred. However, features such as content groups and custom dimensions allow you to combine the data from specific pages into predefined site sections or groups. You can then compare these page groupings to each other to understand how they impact conversion and acquisition.
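To make question 1 concrete, here is a minimal sketch that attaches known customer attributes to a hit using the Universal Analytics Measurement Protocol. The property ID, client ID, and dimension indexes are placeholders; the indexes must match custom dimensions configured in your own property.

```python
# Send a pageview with customer attributes attached as custom dimensions
# via the Universal Analytics Measurement Protocol. IDs are placeholders.
import requests

payload = {
    "v": "1",               # Measurement Protocol version
    "tid": "UA-XXXXXXX-1",  # placeholder GA property ID
    "cid": "555",           # anonymous client ID
    "t": "pageview",        # hit type
    "dp": "/pricing",       # page path
    "cd1": "gold-tier",     # custom dimension 1: customer segment (assumed index)
    "cd2": "returning",     # custom dimension 2: persona type (assumed index)
}
requests.post("https://www.google-analytics.com/collect", data=payload)
```

3. How does a specific content type impact conversion?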
By using the aforementioned features alongside conversion and goal tracking, analysts can pinpoint which content types have the greatest impact on conversion, user drop-off, and other key performance indicators (KPIs) reflecting business objectives. Organizations can then use this information to better allocate resources. For instance, if you learn zero percent of your blog traffic converts on the site, you need to shift resources toward a more conversion-friendly design, more engaging content, or traffic sources that are converting.
4. Can your analysts easily set up event tracking or conversion tracking without developers?
Teams should use tools like Google Tag Manager or Adobe’s Dynamic Tag Management to implement their analytics platforms whenever possible. These platforms give marketers control over what is tracked and how that data is expressed in the analytics tool. Many companies do not use these implementation tools. Instead, they still rely on developers to add code, adding significant time to the tagging process and, in some cases, deterring analysts from tracking at all.
5. What are our most efficient sources of traffic for conversion?
Analysts should be able to relay which traffic sources generate the most conversions and which sources are the most efficient at doing so. An SEM program may generate the most conversions, while organic search might have a drastically higher conversion rate. In these instances, it’s worth exploring what an investment in the organic search channel could do versus making a similar investment in SEM or social media. (See the sketch at the end of this article.)
6. How are key customer segments converting compared with other key segments on the website?
Analysts can create meaningful segments within the analytics tool to understand how different types of customers use the website. These segments can be developed using existing customer data or even basic demographic data, like age and gender. Segmentation yields the data necessary to understand how different types of users are affected online and helps identify areas for improvement.
7. Can we run A/B or multivariate tests on the website today?
A key part of optimizing your digital strategy should be conducting experiments with your website or app content. Many analysts don’t invest in the development of a testing program because of time constraints or simply because they aren’t aware of how easy it can be to run A/B or multivariate tests. Tools like Google Optimize are free and provide a robust feature set that integrates with Google Analytics and Google Tag Manager.
8. What are the primary drop-off points for customers prior to conversion?
Analysts should have a clear understanding of what keeps users from converting on the website, no matter the type of site or conversion. With goal funnels or some elbow grease and expertise, analysts can identify where users drop off the site and what might be impacting their experience. With this visibility into customer needs, you can optimize the user experience and generate more conversions.
9. What are the main reasons users come to our site?
Often teams don’t take time to understand the specific reasons users come to the website because team members assume they already know. However, their assumptions are often based on their own internal knowledge of the business. For example, you may see a significant increase in traffic to the website, only to find users are going to the careers page or reading a specific piece of content that doesn’t necessarily impact conversion.
By understanding the keywords, traffic sources, and landing pages that drive users to certain parts of your site, analysts can create user segments based on intent.
Understanding your digital analytics capabilities is the first step
Uncovering your digital analytics capabilities can be the difference between a measurement strategy that ensures the continual improvement of online customer experiences and an expensive data tool that produces outcomes unrepresentative of business objectives. Before deciding to invest in a data management tool, assess whether your digital team has the expertise to leverage your current analytics tools. In most cases, we find the free versions of tools like Google Analytics and Google Tag Manager are sufficient for the needs of an organization. By getting answers to the right questions, you can discover your organization’s hidden capabilities and begin to leverage digital analytics to meet customer and business needs. Want to explore your organization’s digital analytics capabilities or dive deeper? Let us know.
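As a companion to question 5 above, here is a minimal sketch of comparing traffic sources on volume versus efficiency; the session and conversion counts are hypothetical.

```python
# Which sources produce the most conversions, and which convert most
# efficiently? All figures are hypothetical.
sessions = {"sem": 40_000, "organic": 12_000, "social": 20_000}
conversions = {"sem": 800, "organic": 420, "social": 200}

for source in sessions:
    rate = conversions[source] / sessions[source]
    print(f"{source:>8}: {conversions[source]:>4} conversions at {rate:.1%}")
# SEM produces the most conversions (800), but organic converts at 3.5%,
# almost double SEM's 2.0% rate: the kind of gap worth investigating.
```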
How your bank can use machine learning to ensure future success
Amazon, Netflix, Airbnb, Uber, and other disruptors have raised the bar on what customers expect from a business. These online giants have figured out how to use their customer data to make personalized recommendations and predict when customers are going to buy − and present offers at just the right time. Brands that use personalization report an average growth of 20% in sales (Monetate research), and customers feel less spammed and more like they’re in control of the experience. It’s no surprise that consumers are looking for that same personalized, frictionless experience when interacting with their financial institutions, whether through mobile banking, your website, at a brick-and-mortar branch, or at one of your ATM locations. And it pays off for banks that can engage their customers. According to a 2013 Gallup study, fully engaged customers bring in an additional $402 in revenue per year to their primary bank, as compared with those who are actively disengaged. Even better, the research said 71% of fully engaged customers believe they will be customers of their primary bank for the rest of their lives. That could be your bank, but only if you can reach your customers in ways that feel natural and valuable to them.
Customers want to be engaged with the right messages at the right time
Imagine if you could understand your customers so deeply and predict their buying patterns so clearly that you could deliver targeted marketing only to those ready to invest in more products with your bank. Not only that, what if you knew what to say to them and on which channels to reach them? How would that impact your business? The trend is clear: financial institutions must adopt a customer-centric business model now to ensure success in the future. This puts banks like yours at a crossroads, and the problem is where and how to embark on that journey.
Tackle your greatest challenges
The formula seems simple: increase your engagement and you’ll increase your revenue. But meanwhile, you’re under pressure to acquire new customers, maintain your base, forecast and reduce risk, manage capital, navigate security compliance and financial regulations, and optimize the business. You may also grapple with siloed data, legacy systems, and outdated processes − all seemingly monumental challenges that may adversely affect your customer experience. For example, your customers and employees may not have access to the right data at the right time to provide an optimal experience. Or, from a marketing standpoint, different departments within your company may be targeting the same customers, resulting in too many emails. Or your customers may get untimely messages about promotions that have passed or receive communications that don’t apply to their current situation. This creates frustration and a poor user experience that may be enough to make your loyal customers turn away. Other banks have been in your shoes, facing the same challenges and fears, but they’ve made major strides in putting the focus on the customer. They’ve found success through the “magic” of machine learning (ML), which enables you to focus your over-stretched bankers’ time and your marketing spend on opportunities that are real. ML is a modern technique that uses algorithms to analyze enormous amounts of data. Machine learning models learn on their own, identifying insights and patterns to predict future behavior.
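To make that concrete, here is a minimal, illustrative sketch of the kind of model behind one of the use cases discussed below, flagging attrition risk; the features, synthetic data, and thresholds are all hypothetical, not a client implementation.

```python
# Train a simple classifier to flag account holders at risk of attrition.
# Features and labels are synthetic, for illustration only.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1_000
# Hypothetical features: login frequency trend, balance trend, products held
X = rng.normal(size=(n, 3))
# Synthetic label: attrition is likelier when logins and balances decline
y = (X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=n) < -1).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)

risk = model.predict_proba(X_test)[:, 1]     # attrition risk scores
print(f"Holdout accuracy: {model.score(X_test, y_test):.2f}")
print(f"Customers above 0.8 risk: {(risk > 0.8).sum()}")  # route to outreach
```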
Machine learning algorithms connect the dots far faster and deeper than people can, exposing patterns in your customers’ behavior that empower your team to take actions that will impact your business’ top and bottom lines. Unlike traditional analytics tools, ML can evaluate account holders, securities, and transactions in real time. If you want immediate decisions integrated in the moment, machine learning is the answer. And, good news: even though you may feel you are behind the curve right now, you have something that the younger fintechs you compete against don’t − a wealth of historic data that can be “mined” by ML to answer your specific business questions. Some organizations need help improving the quality of their data for effective use in a machine learning model, and that’s not an uncommon challenge. But good data will be your key to success.
Machine learning applications in finance
Banks have found many successful ways to leverage machine learning. For example, they use it to answer specific business questions across all departments, including:
How do I increase my customer wallet share − what are my best opportunities for cross-selling and remarketing to existing customers, and can I identify customers we can convert from other banking institutions?
Can I identify loan-default risk early enough to take action?
Can I dynamically price securities based on investor demand and market saturation?
Can I predict my cash and reserve activity to optimize liquidity levels?
Can I identify account holders’ attrition activity before they disengage?
What percentage rate and product messaging would make my ideal prospect buy?
The first step toward engaging customers with the right messages at the right time is to capture the questions your bank wants to solve. With these questions in hand, you can move to the next step: seeing how much predictive value these machine learning “use cases” will give your financial organization. Case in point: these very questions are how it started for one of our clients, a large institutional bank sitting on decades of financial transaction data. They wanted to more accurately predict member activity and drive better returns on cash reserves − and leveraged machine learning to do it. Our machine learning model identified patterns in their transactions, which spanned hundreds of credit unions and billions of dollars in cash, to predict the daily deposit activity of millions of credit union members. The result? We freed $40 million in excess cash reserves. The insights gleaned also empowered the organization to pass greater returns on to members by selling short- and long-term securities, pursuing arbitrage, and reducing borrowing fees. Another institution, Primary Financial Corporation (PFC), found great success using machine learning to improve its sales targeting. PFC wanted to predict CD issuers’ funding needs and institutions’ desires to invest. They developed machine learning models that synthesized PFC’s financial and competitive data to price securities, identify buyers, and project trade profitability. By the time the first phase of the project was complete, PFC could predict with over 80% accuracy and 70% precision the likelihood that a particular investor would buy a given investment. The common thread in these stories is that both organizations had an abundance of historic data at their fingertips, but they hadn’t explored how ML could help them retain more deposits, sell more products, or reduce their financial risks.
The rapid predictive insights that machine learning continues to provide to both companies have been game-changing, and both are now exploring other ML applications.
Get started
Machine learning is widening the gap between banks that embrace it and competitors that haven’t. If you don’t improve your banking experience, your customers will turn to another bank or even to a fintech. As you navigate how to become the customer-centric organization you want to be, explore machine learning as a way to get closer to your customers and see rapid results. Start by coming up with specific questions that your business needs to answer, and take time to learn more about what machine learning can do in your organization. Contact Fusion Alliance to discuss if ML is right for your project.
ON-DEMAND WEBINAR: Learn how to turn data into insights that drive cross-sell revenue
How to nail data-driven design: optimize like you’re curing heart disease
A 2019 Salesforce study reported that 84% of 8,000 consumers and business buyers say the experiences provided by a company are as important to them as its products and services. If you have a digital property that drives revenue, data-driven design can help create the best customer experiences possible around your products or services. To understand the value of modern, data-driven design, we can think about it in terms of a much more ancient challenge: heart disease. It’s true. In 2013, researchers reported finding evidence of atherosclerosis in Egyptian mummies, even though their civilizations didn’t have fast food and cigarettes. (However, I can imagine a peasant farmer’s stress could reach dangerous levels if his figs weren’t plump enough for the noble he served.) Modern medicine has come a long way in identifying and treating heart disease, thanks in part to the collection and analysis of data. Similarly, data-driven design is the informed, mature approach to product design. It is the approach you take when you want to use past experiences and real-time customer data to boost ROI. Whether you’re pursuing buy-in from fellow decision-makers or educating your team on the importance of data analytics in product design, heart disease research provides a useful and enlightening analogy to understand and talk about data-driven design. To consistently deliver an experience your customers will love, let’s examine how the scientific method has played out thus far around the health of that organ that makes love possible.
Scant data, but progress nonetheless
Heart disease innovation: Circa 1500, Renaissance man Leonardo da Vinci is among the first to describe atherosclerosis, saying “Vessels in the elderly restrict the transit of blood through thickening of the tunics,” according to the National Center for Biotechnology Information. In 1768, William Heberden describes symptoms, such as chest pain associated with walking. To kick off heart disease research, these men and others gather bits and pieces of data that can be organized into neat yet rather disconnected silos: physiological descriptions, symptoms, and potential causes. They are recognizing and reporting on what is happening, but the “why” still eludes them.
Your data-driven design: During the earliest stages of building customer experiences, you rely on anecdotal evidence and trends in your market to guide decisions around product design and launch. You have few, if any, paying customers. With a prototype, you can test usability to ensure tasks within the software can be completed. At the MVP or beta stage, you gather quantitative data from system analytics (e.g., Google Analytics or Adobe Analytics). Likely, these pieces of data are dispersed among siloed teams: marketing, UX, and IT. In this earliest stage of data-driven design, you begin to see what is happening, but not why.
Data collection increases and patterns form
Heart disease innovation: Research around heart disease is no longer a rarity, and the rate at which new information is uncovered is picking up the pace. In 1856, the “father of pathology,” Rudolf Virchow, defines what makes up a blood clot in the vascular system and begins to identify risk factors. Building on the science that came before him, he develops concepts that remain relevant today. His work is a turning point, as scientists begin to think about clinical implications and how to serve patients (aka customers). They’re seeing valuable connections among physiology, symptoms, and causes.
Your DDD (data-driven design): Your product has been in the marketplace long enough to gain traction. Your team makes some design improvements based on best practices and an awareness of your users, but you still don’t have a deep understanding of them. But data is coming in fast, and that’s a good thing. To get closer to a more balanced picture of your users, you launch quantitative and qualitative data collection methods, such as multi-source analytics tools, surveys, contextual inquiry, usability studies, or diary studies. Patterns begin to form, but without testing, they don’t tell you enough to make meaningful design changes. You realize how marketing, UX, and IT all matter to developing a holistic, aligned picture of your customer.
Major successes and serious reflection
Heart disease innovation: Prolonged study is leading to valuable progress. In 1958, Dr. Mason Sones of Cleveland Clinic successfully threads a tube into a patient’s arteries. It’s the first iteration of coronary arteriography, and the resulting images offer scientists real evidence to diagnose angina (chest pain associated with heart disease). Subsequently, two radiologists simplify Sones’ technique, making it more accessible to more patients. This breakthrough paves the way to modern advances in diagnosis, disease management, and treatment.
Your DDD: Optimizations to the design of your customer experience are proving successful, and stakeholders are happy. Congratulations! But you don’t rest following an outstanding quarterly performance. You continually mature and act on your understanding of the analytics. During this more mature stage of data-driven design, you establish a steady cadence for data collection. A quarterly plan is being carried out, and a year’s worth of data has been gathered that reaches across departments, breaking down silos and instigating rich, ongoing conversations about what to design and why. Multiple perspectives feed overall insights. With an intentional plan and effective measurement strategies in place, now’s the time to experiment. You avoid getting stuck in a design rut by testing your current site against previous versions through multivariate or A/B testing (see the sketch at the end of this article). You might even consider deeper dives into personalization and using machine learning to interact with customers based on their purchase history or content engagement activity.
Heart disease innovation: Through decades of research, experiments, clinical trials, and life-saving wins, the stage has been set for even more advances. Surgery to widen arteries helps tens of millions of patients. Medical therapies, angioplasty, and stenting techniques evolve. Treatment plans include personalized lifestyle changes. And relationships to other conditions, such as diabetes and obesity, shed more light on heart disease. From da Vinci’s early observations, we have reached an age of increasing wisdom, as data and the amalgamation of it grow more and more robust.
Your DDD: At this stage of your data-driven design pursuit, you’ve chosen the right tools to leverage the insights you’ve gathered. Because you have invested in data collection and followed the fundamentals of data-driven design (balance, cadence, conversations, and perspective), you can pursue business goals more strategically. Data might even have uncovered an opportunity to spin off a new product or shift into a new market.
Nail it!
No matter the health of their hearts, your customers expect an outstanding digital experience, which requires you to have a rich understanding of who they are.
A data-driven design approach allows you to build and continually improve that experience throughout the entire lifecycle of your digital property. Getting started doesn’t have to be complicated. Begin by testing what’s available, and mature your analytics from there. As in the scientific method, you’ll experience setbacks as you build and launch products. But when you heed the data, you will design a smarter path forward.
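As a concrete companion to the A/B testing step described above, here is a minimal sketch of judging whether a new design variant genuinely outperforms the current one, using a two-proportion z-test; the visitor and conversion counts are hypothetical.

```python
# Two-proportion z-test for an A/B test: current design (A) vs. variant (B).
# Counts are hypothetical.
from math import sqrt
from statistics import NormalDist

visitors_a, conversions_a = 10_000, 280   # control
visitors_b, conversions_b = 10_000, 334   # variant

p_a = conversions_a / visitors_a
p_b = conversions_b / visitors_b
pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
z = (p_b - p_a) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

print(f"A: {p_a:.2%}  B: {p_b:.2%}  z = {z:.2f}  p = {p_value:.3f}")
# A p-value below 0.05 suggests the lift is unlikely to be random noise.
```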
Customer retention strategies for banks: 5 tips for keeping the wealth in your institution
A massive storm is brewing in the banking, financial services, and insurance industries, and when it strikes, it will be devastating to the unprepared. That storm is the unprecedented transfer of wealth − $3.9 trillion worth − that will be passed from the hands of older generations to younger ones in the next eight years or so. The rains have already started to trickle, but when they come in full force, if your organization hasn’t already connected with younger generations, you’ll see millions of dollars in wealth walk right out your door. If your bank doesn’t have a plan in place for customer retention, it’s not too late to take action. Consider that millennials (born circa 1981-1997, also called Gen Y) are now the largest generation, accounting for over 25% of the population. They are followed by Gen Z (born circa 2000-present), those born with digital devices in their hands, who comprise more than 20% of the population. The combined potential purchasing power of these generations is something that can make or break banks, wealth management firms, and insurance companies. Yet most businesses in these industries still don’t have a game plan to connect with an entire population. Will your company be different? The problem is complex, but no matter where you stand, a solution is within your reach if you create a strategy that is informed by data and insights and has a clear road map to success. Here are five tips for building successful customer retention strategies for your bank, so you can emerge strong on the other side of the impending wealth transfer.
1. Understand the challenges of banking for millennials
Recognize that this is a whole new audience you’re dealing with. The old ways won’t work in the new economy of connected consumerism. A 360-degree view of your current customers will help you gain insights into what the older generation wants, but keep an eye toward the future consumers of your brand. They’re not like baby boomers (born 1946-1964) or Generation Xers (born 1965-1979). This newer generation sees things differently than their parents and grandparents did. Get to know this younger audience on their terms and understand why they have different belief and value systems, and why they view traditional institutions skeptically. Examine the world through their eyes. They’ve seen that industry giants their elders once perceived as invincible (e.g., Lehman Brothers) are now gone, and that others, like Wells Fargo, AIG, and Countrywide, had to be rescued by the government from the brink of bankruptcy, with taxpayers footing the bill. They’ve seen the effects of parents being laid off after years of loyal service to a corporation. They know families who lost their homes when the housing bubble burst. Can you blame them for being leery of traditional institutions? An Adroit Digital survey examining millennials’ brand loyalty reported that 77% said they use different criteria to evaluate brands than their parents do. Are you aware of what criteria they are using to evaluate your brand? If not, you need to arm yourself with answers. Research shows that younger generations frequently turn to friends, independent online research, reviews, and social media for decision making. For example, an astounding 93% of millennials read reviews before making a purchase, and 89% “believe friends’ comments more than company claims,” according to an IBM Institute for Business Value survey. Your future hinges on understanding these behaviors.
A report by Gallup on the insurance sector revealed, “Millennials are more than twice as likely (27% vs. 11%, respectively) as all other generations to purchase their [insurance] policies online rather than through an agent.” Online purchasing is far from mainstream among insurance consumers overall: “74% originally purchased with an agent vs. 14% online – but if this trend among millennials continues to grow, it could substantially change the way insurance companies interact with customers in the coming years,” the report stated. Likewise, “Banks are losing touch with an entire dominant generation,” according to Joe Kessler, president of Cassandra Global. The Cassandra Report cited that 58% of young adults said they would rather borrow money from friends or family than from a traditional institution. Two-thirds of the respondents said it is “hard to know where to learn about what financial services they might need.” In other words, when it comes to banking, millennials don’t know whom to trust. Begin the process of getting to know this younger clientele by conducting research that will help you gain insights into what they stand for, how and where they interact, and what their expectations are of your industry, your company, and your brand. By evaluating that data, you will be able to set up the process for communicating with these new young consumers and build different ways to engage with them. Your interactions and communications must be seamless and easy, and they must show that you can talk in their terms. You’ll need to look at this emerging demographic with a “digital lens” because this is how millennials engage with brands. What are their channels? What are their preferences? What other services can you make available in a seamless, frictionless, and customized way? If you don’t take the time to get to know your audience, you won’t be able to lay the foundation for a successful strategy to engage them.
2. Make young customer retention your bank’s primary mission
Younger generations, millennials especially, are driven by a different set of values. They want a work/life balance. They like to donate money. They don’t want a lot of stuff. They like to travel. They want to experience life. They question long-standing rules that don’t make sense to them. So, develop your business strategy around a purpose or a mission − one that they will connect with. Build upon the information you learned about your younger customers in tip #1, then map these customers’ journeys with behavioral analytics. Evaluate the digital channels and content that your younger clients find compelling. Now you can create a strategy and roadmap to engage these customers.
3. Build your customer experience for different audiences
A strong customer experience (CX), one that creates loyalty, is personalized, timely, relevant, appropriate, and built on trust. The more customizable the user experience, the better. According to Janrain, 74% of online users are frustrated with brands that provide content that doesn’t reflect their personal interests. You know users want to be recognized on their terms, but you have a problem: how do you build a single CX that addresses vastly different generations with different behaviors and interests? Is there a way to reconcile their differences via a single CX? The answer is no. For the time being, you need to develop both. If someone tells you differently, beware. Think about it.
In wealth management, banking, and insurance, the older generation still holds the money and keeps the lights on for your business. The newer generation will get that money within a decade, but if you go full-throttle and build a single, mobile-first CX, you’re going to alienate the people holding the purse strings. In the next few pivotal years, your bank’s customer retention will depend heavily on how well you address each audience on its own terms.
How to cater to older generations
Older folks prefer offline channels, like walking into a branch, agency, or brokerage firm. They like to do business face to face or via phone conversations with tellers, bankers, agents, and wealth advisors. Online, they like having a “control panel” style experience on a desktop, such as you might find with financial trading platforms. This is how you build trust and deliver timely, relevant, personalized experiences. Online, build a web portal that appeals to the interests, needs, and communication preferences of the older generation. The younger generation will use the web portal now and then, but that is not going to be the experience they associate with your brand − because you’ll give them their own.
How to cater to younger generations
Give the younger generation mobile apps and SMS communications. With over 87% of millennials saying they are never without their phone, this is where you should reach them. They have no interest in stepping foot in a building that feels like an institution or talking to some random agent, broker, or salesperson when they can do everything quickly and effortlessly on a mobile device. Take the information you learned in tips #1 and #2 and build strong loyalty, providing timely, relevant, personalized, and appropriate experiences across digital channels. As you build a CX specifically tailored to banking for millennials, you’ll find you can gain loyalty on their terms because you’ll be able to interact in a more agile, nimble, and personalized way. The older generation will probably use the mobile app for simple tasks like checking information and balances, but they’re going to associate their comfort with your brand with the CX they use most − the desktop. Two CXs could be the right solution for today’s transitioning market, but keep in mind that there are additional channels through which you can build loyalty with these younger audiences across the digital landscape. For example, you can share educational, informative content through social media channels.
4. Transfer knowledge to the younger generation
Everyone in wealth management, insurance, and financial services already has a foot in the door with the younger generation. That connection is the strong relationship between existing older customers and their offspring. Leverage it. First, understand that the older generation wants to take care of the younger ones by leaving money to them, but they worry that the next generation doesn’t have the knowledge or discipline to hold onto and grow that money. There are countless stories of young people, like athletes or celebrities, getting rich quickly, getting bad advice about money, and then squandering it all. What if their children make the same mistakes? Help address that fear and protect those kids by arming your older customers with educational tools on how to prevent this from happening.
For this CX, you’ll need to develop portals and educational content, manage and market that content, and make it come to life in an updated website (geared to the older generation) that features whitepapers, articles, or videos such as “Talking to Your Children About Money 101” and the like. Educate this audience on how to talk about the benefits of insurance or long-term investment strategies, and provide incentives to set up meetings that include them, their offspring, and you. The younger generation isn’t interested in talking to an institution, but they will listen to the advice of the parent or grandparent giving them this money. Let the parents and grandparents have meaningful conversations that carry much more weight than your business sending a bulk email to junior that says, “Invest in an IRA.” Now, when members of the younger generation − the recipients of transferred wealth − decide to check out your company on the advice of their parents or grandparents, they will find your relevant app that speaks their language and addresses things that interest them. They’ll soon figure out that you’re not some stodgy institution and will be much more open to a discussion when their parents suggest a conversation with your company’s brokers, advisors, or agents. This is how the knowledge transfer will occur organically, growing your bank’s customer retention along the way as you build a relationship of loyalty and trust. Not only will you give the benefactors peace of mind that their offspring will be good stewards of their fortune when the time comes, but you’ll also keep the money in-house because you took the time to connect with and earn the trust of the young beneficiaries.
5. Make use of emerging technologies in banking to satisfy the ever-changing digital landscape
At this point, you know you could benefit from two CXs. The web platform focuses on the needs and concerns of the older generation that holds the wealth today. The mobile platform addresses the younger people who will inherit the wealth, providing guidance, teaching the basics of how to invest or buy insurance, and offering quizzes, games, personalized spreadsheets, automated tools, and more. The challenge is that when the older generations pass on, the desktop experience will be moot. You don’t want to have to rebuild all the technology infrastructure you worked so hard to establish. The answer? Don’t build applications or tools − build platforms for the future that can be adapted as the younger generation takes over and as mobile-first interactions become predominant five years from now. Don’t overlook the fact that more cost-effective emerging technologies in banking, such as infrastructure in the cloud, will be a necessary ingredient for success. Banks and insurance companies are often reluctant to move to the cloud, but if you understand that most applications are going to be in the cloud five years from now, you understand the critical nature of developing these capabilities today. The cloud enables rapid changes to meet market and customer demands. It is flexible and nimble. You pay for what you use, can pay for service or infrastructure, and simultaneously increase security and reliability. To those unfamiliar with the cloud, security can be a scary proposition. However, with major cloud providers like Microsoft and Amazon employing an army of experts to ensure security and regulatory compliance, the cloud is safer from a security standpoint than most on-premises data storage.
While 85% of companies using the cloud report they are confident their providers can deliver a secure environment, 90% of IT managers report they are not confident in their own companies’ ability to detect security problems internally. If you’re building a flexible technology platform with the right digital CXs, infrastructure that looks to the future, and cloud capabilities, then your organization will be positioned for success when the wealth transfer hits in the next decade.
Final thoughts on customer retention strategies for banks
There are more than 75 million millennials out there spending $600 billion every year, and that number is only going to increase. They are graduating from college with massive amounts of debt, face a precarious job market, and are typically naïve about financial matters and insurance. The companies that aggressively work to offer practical tools and advice on banking for millennials are the ones that will outperform their competition in the future. It’s not too late, but you cannot wait to take action. If a business does not begin building the bridge between current wealth owners and soon-to-be wealth recipients until after the wealth-transfer process has begun, it will experience a devastating economic blow and get left behind by those who have embraced this shift.
The ball is in your court
Everyone has predicted that the landscape of the wealth management, banking, and insurance markets will change dramatically due to digital disruption and younger generations, but with the right strategy in place, your organization can emerge as a leader. Look at this as an opportunity to differentiate. A digital strategy will be the key to your success. Don’t look at digital as an application. Digital is the way all future generations will engage and interact. Leverage it today, and do it well, to tie the present to the future. Your formula for success is to create an actionable plan that is informed and driven by insights and data: what people buy and how, what they expect, how they feel, and whether the experience is personalized, relevant, and timely. You need to understand your audience and use those insights to feed a strategy that ties into the mission and purpose of your customers. Bring your strategy to life in a digital channel that sits on top of flexible technology. Measure your customers’ experiences and level of engagement with your brand, and then make adjustments, continually working from research and data. Follow this formula, and eight years from now, you’ll be the organization reaping the rewards because you understood how to keep millions of dollars from leaving your company. Need help improving your customer retention in banking? Let us know.
Top 3 reasons to invest in machine learning for mobile
Artificial intelligence (AI) and machine learning (ML) have completely transformed mobile development. Mobile app users today are often looking for an easy and relevant user experience — one that has been customized for them. The best way to get there? Machine learning. Machine learning identifies anomalies and patterns that ultimately optimize the user experience. If your technology conversations have stalled at the brainstorming or ideation phase, consider why. If you don’t have a clear answer, you’re not alone. “Strategic decision makers across all industries are now grappling with the question of how to effectively proceed with their AI journey,” says Marianne D’Aquila, research manager, IDC Customer Insights & Analysis. Despite questions about how to proceed, organizations know they need to invest in ML for mobile before current competitors, and those waiting in the wings, figure out how to profit from it first. Considering the speed at which machine learning is being adopted, and its potential to quickly help companies on multiple fronts, the time for execution and implementation is now. Here are the top three reasons that make machine learning development for mobile important right now: 1. Machine learning for mobile increases app security “Facial recognition” ($4.7 billion, 6.0%) and “fraud detection and finance” ($3.1 billion, 3.9%) were among the top five categories of global AI investment in 2019, according to the AI Index 2019 Annual Report (an independent initiative at Stanford University’s Human-Centered Artificial Intelligence Institute). It’s not surprising. From TikTok’s recent security flaws to Target’s $18.5 million settlement, app vulnerabilities and potential data breaches are breaking news, and there are few signs of a slowdown. While the short-term financial impact can hurt, the long-term cost of losing the trust of customers and partners can be even more painful. Companies that receive users’ personal information (e.g., passwords, billing addresses, answers to security questions) for processes such as app authentication or making purchases must continually optimize how the data is used. Through machine learning and automating parts of the process, you can identify anomalies faster, allowing you to see patterns and manage potential weaknesses more quickly. Operationally, ML can detect and stanch security issues related to data inside your company, such as logistics or pricing anomalies, that could be a drain on resources. For example, if one of your products is selling faster than usual via a shopping app, it could be related to a pricing error. Do you really want that $450 device on sale for $4.50? The mobile application landscape comprises a wide variety of operating system versions, devices, and software systems, which creates a much greater number of attack surfaces for attackers to target. (A first step to optimizing security is risk evaluation and awareness. Contact Fusion to hear more.)
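To make that pricing-anomaly idea concrete, here is a minimal sketch in Python. It illustrates the general technique, not Fusion’s implementation: a production system would typically learn its baseline with a trained model rather than a simple z-score, and the sales figures and threshold here are invented.

```python
from statistics import mean, stdev

def flag_sales_anomaly(hourly_sales_history, current_hour_sales, threshold=3.0):
    """Flag the current hour if sales deviate sharply from the historical baseline.

    A z-score above `threshold` suggests something unusual -- a viral spike,
    or a pricing error like a $450 device listed at $4.50.
    """
    baseline = mean(hourly_sales_history)
    spread = stdev(hourly_sales_history)
    z = (current_hour_sales - baseline) / spread if spread else 0.0
    return z > threshold, z

# Illustrative history: units sold per hour over recent days for one SKU.
history = [12, 9, 14, 11, 10, 13, 12, 8, 11, 10, 12, 9]
is_anomalous, z = flag_sales_anomaly(history, current_hour_sales=87)
if is_anomalous:
    print(f"Sales anomaly detected (z={z:.1f}) -- check pricing and inventory.")
```

The same pattern, comparing live behavior to a learned baseline and flagging large deviations, applies equally to login attempts, API traffic, and other security signals.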
2. Machine learning leads to increased mobile privacy It could be argued that the recent news cycle around privacy indicates a real desire for clarity, if not outright skepticism. In more than 3,600 global news articles on ethics and AI from mid-2018 to mid-2019, the dominant topics were “framework and guidelines on the ethical use of AI, data privacy, the use of face recognition, algorithm bias, and the role of big tech.” You’ve heard about Russia’s role in the 2016 election and the use of personal information for ad targeting. These sorts of debacles haven’t led consumers to give up on digital. Instead, they are demanding more privacy oversight and are being more cautious about the apps they use. Privacy concerns are complementary to security issues. While security involves keeping personal data out of the hands of hackers, trolls, or criminals, privacy is about keeping personal data in a person’s own hands, away from any individuals or organizations that don’t need to be privy to it. For example, if you use an activity-tracking app to record runs, you might appreciate a note when you hit a milestone: “You had a personal record today!” Machine learning makes it possible for the mobile app to detect this activity directly and send a congratulatory message without any human intervention. There’s no need for a stranger to know you clocked a fast 10K. Machine learning on the edge further increases privacy by eliminating the need for data to be sent to the cloud. When ML on the edge is in place, individualized data never leaves the device, keeping the user’s personal information in their own hands at all times. Amazon Alexa and Google Home employ ML on the edge, as some functions are handled on the device while others have to go to the cloud. In addition to supporting privacy, the reduced travel time for data makes these apps and devices faster. 3. Machine learning for mobile helps create personalized customer experiences Consumers expect their demographic, behavioral, and other personal data to be secure and private, while they also want increasing levels of personalization. Delivering on these demands can be a delicate, real-time balancing act for companies, but machine learning helps make it possible to juggle data acquisition with protection and those prickly questions around how to use the data to everyone’s advantage. But is there a clear business case to pursue personalization? According to a 2019 Salesforce report, the answer is yes: 75% of the 8,000 consumers and business buyers surveyed expect companies to use new technologies to create better experiences. Machine learning for mobile enables you to make user-experience headway on several fronts. First, it can help you build a baseline of customer app usage. Once you have that baseline, you can see patterns in user behavior. Next, particular behaviors or deviations from the baseline can trigger delivery of a relevant coupon, a suggested product to explore, or a reminder to revisit an abandoned shopping cart. Even more sophisticated, ML can serve up the colors, screen layouts, and language that appeal most to a particular user. And with machine learning, the reactions happen in real time. The more your user engages with your mobile app, the more refined and personalized the experience becomes. Through machine learning, your brand becomes more closely aligned with the customer experience your customer desires.
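Here is a hedged sketch of that baseline-and-trigger pattern. Everything in it is hypothetical: in production, the per-user baseline would be learned by a model from app analytics rather than hard-coded, and the action names would map to your own campaign tooling.

```python
from collections import defaultdict

# Hypothetical per-user baseline: average sessions per week.
# In a real system this would be learned from app-usage data.
baseline_sessions = defaultdict(lambda: 5.0)

def personalization_trigger(user_id, sessions_this_week, abandoned_cart):
    """Compare this week's behavior to the user's baseline and pick a next action."""
    baseline = baseline_sessions[user_id]
    if abandoned_cart:
        return "remind_abandoned_cart"
    if sessions_this_week < 0.5 * baseline:
        return "send_reengagement_coupon"   # engagement dropped well below baseline
    if sessions_this_week > 2.0 * baseline:
        return "suggest_related_products"   # unusually engaged -- surface discovery
    return "no_action"

print(personalization_trigger("user-123", sessions_this_week=1, abandoned_cart=False))
```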
Getting started can feel uncomfortable at first, but at Fusion, we’ve found that organizations often have low-hanging fruit ripe to benefit from machine learning for mobile. You just need to be able to see and then act on those opportunities. Working alongside you on this journey should be people who understand data science and machine learning, and who can uncover weaknesses to target. Now is the time to move forward on machine learning for mobile initiatives. Current market conditions indicate a shortage of professionals in machine learning and data science. Fusion fills this gap. If you’re interested in hearing more about machine learning for mobile, let us connect you with one of our experts.
7 remote work best practices for employees
Remote work. You were either suddenly launched into it because of COVID-19 or you’ve been doing it for a long time. Either way, when the states first began issuing stay-at-home orders this March, we released this comprehensive list of remote work best practices for employees, and it was a huge hit. Even seasoned remote workers told us that these tried-and-true tips have immensely helped their teams. Scan through. If you apply just a few, you’ll feel an immediate impact too. Why listen to us? Fusion Alliance has a decade of experience supporting clients with remote workers. We’ve got two decades of remote work experience with our own employees, and when the stay-at-home orders were put in place, we fast-tracked many companies to get up and running remotely to ensure business continuity for their clients. And we continue to provide them with remote support today. Our experience helping on the technology side (including advice, implementation, and support) is so extensive that we can do it seamlessly and rapidly. But it’s the human side that takes some adjustment. So, without further ado, comb through these tips, and you’re sure to find some nuggets of gold. Best practices that will help you right now 1. Your mom was right. Routine will help increase your productivity If you’re new to remote work, your normal routine has been disrupted. But routine is your friend. It helps create calm and increases productivity. It is purposeful and sets expectations, including your own. Be strategic and proactive to take charge of your new situation. Set aside time to re-create a structure for your workday, even if you’ve been at it for months. Stop and re-evaluate. What do you want your remote work life to look like? What do you need to accomplish, and what variables make remote work challenging? Here are some ideas: Replace your previous commute times with something you do for your own wellbeing. Exercise or engage in an activity that gets you mentally and physically ready to do and be your best each day. Start your day by looking at your schedule and making a short list of what you need to accomplish for the day. Cross items off your list as you meet these smaller goals. This helps provide a sense of progress. Set up and maintain regular work hours. Communicate these to everyone affected, especially since everyone’s schedules are changing again and again. Schedule the most important and challenging tasks at the time of day when your mind is sharpest. Tackle other tasks, like responding to emails, when you’re not as focused. Give your eyes, body, and mind a break. Stand and stretch every 60-90 minutes. Set alarms or timers to keep you on track. Schedule a couple of short breaks to look away from your computer, get up and moving, take a brisk walk, or have a little fun with others in your household. Distractions and productivity are often the greatest challenges at the beginning of this process. Statistics show you lose more than 20 minutes of productivity each time you are interrupted. To combat this: Decide not to immediately react to messages, chats, texts, emails, and calls (unless that’s what your job entails). Turn off or ignore notifications during selected hours of deep work. Be thoughtful about setting aside two or three times a day to answer these, and communicate this system to any relevant parties. A recent study shows that 85% of businesses say productivity has improved due to increased workforce flexibility, so be encouraged by that.
Set up your own visual system at home where others in your family can see whether it’s OK to interrupt you. Some people use a tented card that’s red on one side (busy) and green on the other (OK to interrupt). This works for spouses too. If you have children at home, let them know when you are available and when you are “off-limits.” (That said, we know that babies and children are experts at testing best practices. Have reasonable expectations of them and yourself. You can only control so much.) When your workday ends, stop working and leave your work area. Setting limits isn’t just for children. If you’re checking messages or doing “smaller tasks” all night, it’s probably taking its toll on you and your family. If you’re the kind of person who will work overtime regardless, define set hours to work each evening, and tell your family. Be fully present to those around you when you’re not working. They might even like you more. 2. Set up your setup for success The right setup is critical to productivity and your ability to maintain a work-life balance. The following will help you mitigate the challenges that come along with this transition: Rethink where you’re working. Did you choose a place that’s comfortable, apart from the main traffic in your home, and as free from distraction as possible? Try natural lighting and fresh air. Clean up your space, remove clutter, and make it a space you enjoy, not dread. Ensure that it’s a place where whatever papers and notes you leave out will remain untouched when you leave the area. Choose an ergonomic chair and desk. You have to be comfortable to be productive. Organize everything you need for work in your “office” area and keep it there. That way, when you leave your workspace, you leave your work too. Get your technology in order. Is your internet reliable? Do you have enough bandwidth to do your work? If bandwidth is an issue, try designating a time when your kids can play online games or watch videos. If you still have issues, ask your employer about your options. Use a noise-canceling headset with a microphone, and make sure it’s compatible with whatever online meeting solution you use. Test your video-conferencing tool. Set up your computer or camera to look you right in the eye. Check to see what items will show in the background if you are in a meeting. If it doesn’t look clean or work-appropriate, remove it. Stick to your normal morning routine (showering, dressing appropriately), and be mindful that you may have to jump on a video call at any point during your workday. Consider the expectations of your potential audience and how you should appear on a business call. 3. You’re in charge: Take charge of your mind, mood, and tone We’re all aware of which brands stepped up or didn’t during this pandemic. That’s going to have a long-term impact on them. Similarly, your actions and attitude are noticed. Are you handling remote work professionally? Be a positive, encouraging force, and show your ability to adapt. Work collaboratively with colleagues and customers who are going through stresses similar to or worse than yours. Behave with empathy and patience because you don’t know what someone else might be going through. Remember that different family situations are not as conducive to working from home. If you hear a toddler in the background or see a spouse or child walk into view during a videoconference, understand that everyone is doing their best. You can’t expect the same environment as you would in the office.
If your colleague seems to be operating in the midst of chaos, think about their situation, and interact with flexibility and kindness. 4. Communicate well and feel the difference As a remote worker or manager, your job is to continue to provide the level of service your company and customers are used to, with minimal disruption. Your ability to communicate well will make the difference. Be fully engaged and collaborative in your interactions. Make communications personal. Take time to ask colleagues about themselves. How are their families? What’s new in their lives? Initiate a conversation with your team to make strategic decisions about communication channels, daily standups or check-ins, and team meetings, using videoconferencing to keep it more personal and connected. Set expectations. It’s amazing how few people have asked others the best way to communicate with them. Have you told people the best way to reach you? Text, email, Teams? If you’re checking emails and messages only periodically on Microsoft Teams, WebEx, Slack, Zoom, Skype, or whatever other channels you use, let people know how often you plan to do that and what to do when they have an urgent matter to discuss. Make sure you’re all logged into your business collaboration tools or chat channels all day, or at whatever times are expected. Update your calendar. Block out time for deep work on projects where you shouldn’t be interrupted. And tell your colleagues to keep their calendars current too. Schedule regular meetings to keep people connected, organically increase accountability, and give your team a chance to chat. If anything, managers need to be communicating more, not less, until they find a comfortable cadence. Leaders and managers, some words of advice: Make sure you call each employee on your team to check in and see how they’re doing. Start with a personal conversation and ask if they need any help professionally. It will go a long way if you focus on your employees, not the business. Make meetings fun and personal. Maybe add a giveaway now and then, even if it’s as small as a $5 gift card to a coffee shop. Change it up. Make sure all members of the team clearly know their own roles, assignments, and the common goals you are all working toward. Be intentional with your communication. Figure out what works best for your company and team. A lack of communication often results in employees feeling isolated and uncertain. There’s a fine line between communication and over-communication. Don’t cross it. Employees of larger companies often complain that they get the same forwarded email from multiple managers, filling their mailboxes with redundant information. Leaders, send your message once to your entire intended audience to avoid overwhelming your teams with the exact same message again and again. 5. Online meetings don’t need to be useless: Get the most out of them Video meetings help you connect with your audience both visually and audibly. Be mindful of your body language and gestures. Create a positive, collaborative tone. Here are more tips: Add your picture to the meeting app. It’s more personal. Include your contact information and mobile number. If you are meeting someone new, precede or follow up with a simple message welcoming them to reach out to you. That’s a nice touch. Introduce yourself. In larger meetings, say your name when you begin speaking. And, if you are in a meeting with people you don’t know, state your name and provide context about your role.
“I’m Lisa Jones, operations manager.” Be present. Are you multitasking or listening intently? Mute your microphone when you’re not talking or actively participating in the conversation. The sounds of your dog barking or your kids arguing can be distracting. Be heard. Share your thoughts and ideas. You are in the meeting because of your valuable knowledge and insights, so share them. Dress appropriately for your audience. 6. Be social for your own sanity For many, the lack of interaction with colleagues is the hardest part of working remotely. With digital resources, you can remove some of that isolation. Use audio or video chat to work together on something small instead of doing everything through text chat. And have a little fun. Plan a happy hour now and then, where you all videoconference and chat over food and drink as if you were together. Have virtual lunch meetings with your work friends. If your company can do it, have lunch delivered to the home of each member of your team. Have a group chat where you watch a video, read a book together, or work on a community project. Schedule a time to play games using a website that facilitates team-building activities or games. Schedule some after-hours fun with some close work friends. If you have the technology in place, use it to your benefit. 7. Exercise and move: Stand up as you read this Are you more sedentary now that you’re not in the office? Get up and get moving. Here are levels of exercise you should attempt: Good: Take a quick break and stretch every 60-90 minutes. Set a timer so you actually do it. Take a walk during a call or at lunchtime. Do jumping jacks, sit-ups, or some kind of exercise every time you finish a task or stop to get a drink or use the bathroom. Better: In addition to the above, block out one or two 10-minute sessions to raise your heart rate and breathing with a brisk walk or jog, sit-ups, push-ups, or anything else. Best: Do all of the above, and also take an extra hour a day for walking, jogging, cycling, yoga, etc. Mix it up. Create a long-term goal, use an exercise app, find a plan online. Summary There you have it. If you start applying these best practices now, you’ll definitely feel the difference in your personal adjustment to remote work. Celebrate your successes and take time to breathe! Meanwhile, there’s a whole technology side too. If you need help with that, or with acquiring and implementing tools like free Microsoft Teams collaboration software licenses, we’re just a click away. We’ve gotten clients’ remote workers up and running on VPN in a single day, and that’s only one example. We can do so much more. Our teams are activated and ready to help.
8 ways to manage your remote workers effectively
74% of CFOs and finance leaders said they plan to move at least 5% of their workforce to working from home full time, according to a 2020 Gartner survey. Nearly 25% said they’d move 20% of their staff to permanently remote positions. That’s one in five workers. Organizations ranging from a handful of employees to thousands abruptly shifted to a remote-work model across the nation due to the pandemic. But just because teams have worked remotely doesn’t mean it’s being done in the most effective way. Managers, you’re on the hook to expand your skills and show that you can run a productive, efficient, well-oiled remote team. We’re here to help. Read on to discover small adjustments that will make a big difference starting today. If you need more information, we wrote a similar article about best practices for employees. It has slight overlap, but numerous suggestions you can pass on to your team members. Why should you listen to us? If you’re like most managers, you probably have little to no work-from-home experience. But even if you’re a pro at working remotely, managing an entire team of people new to remote work and collaboration technology creates a whole new set of challenges. We have the know-how to help. Fusion Alliance has decades of experience managing our own remote workers. Additionally, we help organizations set up and support their remote workforces from a technology and collaboration standpoint. And our work in this area has only ramped up since the COVID-19 outbreak. 8 tips to being a more effective remote-team manager 1. Take the temperature of your remote workers, figuratively Pandemic or not, the most effective managers have their finger on the pulse of what each team member is feeling and doing. In a remote world, you need to continue to gauge how your team members are adapting and offer help. This connection will fast-track you to helping everyone meet your shared goals. If you haven’t had time to check on your employees one by one, begin now. Instead of talking about their projects, start by asking how they are, and let them lead the conversation. How are they coping? How is the family? Is everyone well? Was anyone laid off? Is it challenging with others in the house? Ask if they have questions or concerns that you can help with. We bring this up because it’s amazing how many people say their managers have done little more than ask a quick “how’s it going?” and move on to the “important” stuff – work. Other managers are so overwhelmed by their own new workload that they fail to reach out at all. Yet others don’t even address this adjustment and see things as “business as usual.” On the flip side, we’ve heard instances of senior executives taking the time to email individual employees to check on them. Imagine the difference that makes in helping employees feel valued and creating loyalty. Even after several months of being remote, one thing employees still complain about is feeling unsettled about their workspace at home. This is a good conversation starter. If your company provides guidance on how to best create a work-from-home workspace, share it, or pass along the tips you find here. All of this will alleviate stress. 2. Provide ample communication When you work on-site, you learn a lot about what’s going on from coffee-area and hallway discussions. With that channel gone, here are ways to help your employees feel less siloed: Be more “visible.” Change your status on your collaboration tool to “available” as much as possible.
Let your team members know you’re there to help if small questions come up as they work on a project. Set a daily time when you have “open hours” and can be interrupted. Even a half-hour will help. You’re privy to more information than the average worker. Keep your employees updated on what’s happening at a macro level in the organization. How is the company doing? How are different departments performing? What are the company’s greatest challenges? What is the go-forward strategy? Tell colleagues when someone gets hired, takes a new role, or leaves. Make sure every member of the team knows their priorities and goals and understands their individual role. See if some team members need more frequent one-on-ones at times. Be intentional with your communication. Figure out as a group what channels are best for communicating and how to reach each team member when something is urgent. Set expectations if you want everyone to have a certain channel of communication open all day. Confer with the team about how this can happen without interrupting everyone. Use the chat feature for quick questions instead of scheduling 30-minute meetings. Don’t forget to praise your team’s accomplishments in public. 3. Set a routine, and the change will be remarkable In our article, 7 remote work best practices for employees, the first best practice is about creating a routine. Look at those suggestions to take charge of creating a workday structure. They’ll help you move from reactive to proactive. Try to be consistent in your routine before and during work. Stick to a time when you’ll stop working so that you don’t burn out. Add exercise into the mix, including during work. Do some cardio every 90 minutes, whenever you switch projects, or when you stop to get a drink. If you’re a leader who has figured out a work-from-home routine and you’re coping reasonably well, be patient and give grace to others who aren’t in the same place as you. Some are alone and struggling. Others deal with: young children; students and spouses competing for bandwidth; no real office space that is free of distraction; friends and family who may have taken ill; financial difficulties that are overwhelming; and on and on. As time goes on, many people’s personal challenges compound. Do what you can to listen, prioritize work, or offer flexibility like shifting hours. Your genuine concern will help alleviate stress. 4. Reduce distractions A UC Irvine study found that it takes 23 minutes to refocus after being interrupted just once. Just as you tell a child to focus when they get off task, train yourself to manage and reduce work interruptions. You’ll instantly see a change in your own productivity. Turn off social media and other notifications on your phone during work hours. Minimize notifications on your computer. Set aside a couple of hours where you work non-stop, ignoring incoming emails, instant messages, and the like. Change your status message on your collaboration tools or email to say you’re “busy now, but free at x time.” Ask your team to keep their calendars up to date and to schedule chunks of time on their calendars to focus on specific tasks. Either they manage their calendars or someone else will fill them with meetings. Talk to your team about your availability and theirs, and brainstorm how to collaborate with one another. (This goes back to the previous points.) Try responding to emails and communications only a few times a day at specific times.
Come up with your own personal system, communicate it to your team, and encourage them to create their own. Finally, have discussions about how to reach each person if something is urgent. The distractions caused by those in your household are a different story. Each family member or pet adds another variable. Have a discussion about your schedule and meetings to set expectations. Try putting a card near your workspace that says “busy” or “free,” a visual indication of when not to interrupt. Come up with something that works for your family. We acknowledge that visual cues often mean nothing to young children, but using the ideas above will at least help with work distractions. 5. Meeting collaboration and expectations If your meetings have always been in person in the past, you may need a different formula for virtual meetings. If you haven’t already done so, set up processes and expectations. Ask your team not to multitask so that you can keep meetings shorter, and make sure you’re doing the same. If you’re heard typing in the background during a meeting, or if you’re not responding when someone asks you a question because your focus is elsewhere, don’t be surprised if everyone else begins doing the same. Start meetings promptly. Ask everyone how they are. Chat a little while. Decide as a team whether you need to meet on a more frequent basis. But make sure your check-ins are brief and provide value. Set an agenda of one or two things you want to discuss, and no more. If you’re doing daily standups, set the structure up front. We’ve seen highly effective 15-minute standups with a team of 20. It works. Start promptly. Tell each team member to give a quick update only if needed or if they’re stuck. Otherwise, they don’t need to speak if they’re working on something they worked on yesterday. For daily briefings that include slides, use graphics and images as much as possible. Don’t read slides to your audience. Let them read the slides on their own while you discuss only the things that have changed since the previous daily meeting. This is a strategy called “brief by exception” that lets you provide pertinent updates without wasting time. In a standup, start with yourself to demonstrate what you want them to do. If someone is going off on a tangent, jump in and offer to continue that discussion offline. You’ll get into a cadence if you keep people on track, and they’ll get the hang of it. Don’t schedule meetings more than an hour long. If your meeting must go longer, take a 10-minute break so everyone can stretch and change the scenery. If group meetings are important, record them. They may be searchable or have transcripts available for those who cannot attend or need to step away. After any virtual team meeting, send a recap of important takeaways. Not everyone is able to be attentive with a child shouting in the background or when their Wi-Fi is acting up. 6. Virtual meetings and video — do or don’t? It’s a no-brainer that communication improves when you see a person’s gestures, expressions, and body language. Many companies suggest video calls, even under normal conditions. Video calls can reduce the need for travel, as well. That said, not everyone’s remote-work setup is conducive to video. If you want to have a meeting with video, tell the attendees in advance. Some people don’t want others to see their makeshift office or the kids running around in the background. Many collaboration tools allow you to blur the background or change it to something else, so recommend that option.
But if someone is stressed about using video, take notice and don’t force it as the norm. 7. Create energy and preserve the organization’s culture Do your part to create a fun, energetic, engaged remote culture. Here are some ideas: Send the team a meme or a trivia question every couple of days. On a group-communication channel, share something personal. Have fun. Laugh a little. At the end of meetings, have someone provide a fun fact. Or make a statement, and ask people to guess whether it’s true or false. Ask people a silly question like “chocolate or vanilla?” Change it up. Try virtual lunches, happy hours, or group calls with friends. But also be aware of device and work fatigue. Some people are tired of looking at their computers all the time, or they just want to do something else. If you act as an example of making time for lighthearted interactions and human connection, your employees will feel much more relaxed. 8. Accept that remote work is here to stay, and be a positive force As of October 2019, just months before the coronavirus outbreak, nearly 16% of the U.S. workforce (that’s more than 26 million people) was working remotely part of the time, according to the U.S. Bureau of Labor Statistics. For years, the data has shown that engagement, happiness, and health all improve for those who work from home:
Two-thirds of employers report greater productivity from remote workers as compared to on-premises workers.
75% of people say they are more productive as remote workers due to decreased distractions, according to the FlexJobs 2018 annual survey on telecommuting.
82% of remote workers feel less stress, according to a PGI study.
Regardless, many traditional managers don’t see it that way. There’s a camp that firmly believes that employees will slack off without direct supervision. Or that “work should be done at work.” However, months into our national remote-work experiment, many of the myths about working from home have been dispelled, and many non-believers have been converted overnight. That’s good news, because it looks like having part of the staff permanently remote will emerge as a new norm to reduce costs in a post-pandemic world. The takeaway is that remote work is going to continue to be a larger part of the fabric of work life. Embrace it. Understand it. Make it work for your organization. Let go of preconceived notions and be the champion of your remote-work team. Model the behavior and attitudes that you want to see from your employees, and they will respond in kind. Final thoughts These tips only work if you apply them. Share them with your colleagues. And don’t forget to share our tips for employees with your team. Need help? If your company needs help with the technology side, support, or a VPN set up in a day, our experts in infrastructure, cloud, and collaboration are activated and ready to help. Have questions? Contact us. We’re ready to help.
Cloud 101: Let’s remove the complexity
Whether you’re exploring if the cloud is right for your organization or just trying to figure out what all the buzz is about, this article will break down the basics of the cloud for you. What is cloud? The cloud is a model in which IT resources are provided to you as a service. This means servers, networks, platforms, and software. A provider takes ownership of all related resources and processes, and you pay them a monthly fee. What are the benefits of cloud? If the cloud is nothing more than servers being hosted for you, then why should people move to the cloud? After all, you’ve already made a heavy investment in your existing infrastructure. There shouldn’t be a need to switch over to having a cloud provider host your machines when you’ve already paid for them, right? The difference lies in the service being provided to you. Cloud providers give you an easy-to-use, configurable way to spin up servers, networks, and storage on demand, quickly. They can take charge of your hardware, your data, and all of the processes related to it (disaster recovery, maintenance, capacity management, etc.) so you have time to focus on your business. The important benefits you gain when your servers and services are hosted in the cloud are:
Agility
Scalability
Reliability & Security
Cost
Agility Think about how much time it takes to provision a full application in your environment. You would first have to identify each tier of the application (web, application, data, etc.) and determine the hardware each tier needs. Assuming you don’t have spare capacity on hand for a large application, you would have to go through the request and procurement process to obtain the hardware. This requires you to very closely right-size the capital expenditure so you don’t overpay. When the hardware arrives, you’ll have to install the servers in the data center, deploy the virtualization software to each machine, and start to build the machines as stated in the specification. Even if your company is one of the few that diligently buys and over-allocates hardware to compensate for future capacity trends, we’re still looking at an inability to rapidly provision servers when a business unit needs them. One of the characteristics that makes the cloud so appealing is the ability to quickly provision on demand. No up-front investment, no additional labor cost. Simply select a server size, an operating system, and the other configurations you’re looking for, and the cloud can create that server for you in minutes. Agility means time. And the less time you have to spend on workload tasks, the more time you can spend on the more important things that deliver value to the business.
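To see what “create that server for you in minutes” looks like in practice, here is a rough sketch using AWS’s boto3 SDK; other providers expose equivalent APIs. The region, machine image ID, and instance size are placeholders, not recommendations.

```python
import boto3

# Ask the cloud for a small Linux server -- no procurement, racking, or imaging.
ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder machine image
    InstanceType="t3.micro",          # the "server size" you select
    MinCount=1,
    MaxCount=1,
)
print("Launched:", response["Instances"][0]["InstanceId"])
```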
Scalability Determining capacity remains an elusive task for many companies. The resources you need are predicted from trends in what has been used, and the problem with predictions is that they are not always right. Even more important, it’s very hard to justify acquiring additional hardware and resources based on predictions alone. As a matter of fact, many infrastructure teams choose to downgrade servers rather than buy more hardware. The cloud cuts through this with (seemingly) unlimited resources that you can pool and use. Need 5,000 more web servers across the world for a few weeks to handle your Black Friday sales? No problem with cloud. Never having to worry about maximum capacity, and focusing instead on right-sizing existing resources, becomes the de facto standard with cloud. Keep in mind that when you scale you pay for more, but when you’re not using those resources, you don’t. Reliability and security If you don’t own your servers, and your data is no longer in house, how do you work through strategies like business continuity and disaster recovery? How do you control risk and cybersecurity? This boils down to trust. Cloud providers have spent a lot of time working to satisfy security and regulatory standards, like PCI, HIPAA, and Sarbanes-Oxley. They also have substantial resources dedicated to storing and securing data. After all, why would a provider spend all that money preparing the cloud for the biggest of businesses, only to get hacked and have countless lawsuits on their hands? With cloud providers, you own your data, it is secure, and their disaster recovery service-level agreements (SLAs) are unmatched. Cost There are many benefits on the cost side of the cloud. We went ahead and wrote an additional article to dive deeper. Check out our article, The Cloud: From Cost to Profit Center. Types of cloud Wait, there’s more than one type of cloud? Yes, and each offers a different division of responsibility between you and the provider. We’ll describe them below:
Software as a Service (SaaS)
Platform as a Service (PaaS)
Infrastructure as a Service (IaaS)
On-premises (private cloud)
Software as a service (SaaS) SaaS offerings are turnkey solutions in which a cloud provider gives you access to targeted application services. Your company is not required to install, configure, or deploy any infrastructure. The provider does all of it. For many organizations, SaaS represents the first foray into the cloud. Flagship SaaS products, such as Office 365 and Salesforce CRM, represent the majority of growth in the competitive cloud marketplace. For any enterprise looking to procure new enterprise applications, SaaS is generally the first stop on the journey. While potentially sacrificing some flexibility, organizations receive an operationalized subscription service that is available the moment they are ready. Platform as a service (PaaS) For organizations that build and run their own applications, PaaS offerings give you the best of the cloud and the best of customized solutions. With PaaS, your development teams focus on one thing: developing the application solutions that add value to your business, and nothing that doesn’t. Your team members can focus on code, data, and integration, and the infrastructure is left to the cloud provider. The advantage of PaaS is that your teams get the flexibility of custom solutions with the infrastructure and management capabilities of the public cloud. PaaS applications are also uniquely valuable because they offer the following functionality:
Unlimited, push-button scaling without deploying new code
Transparent and independent infrastructure patching and updates
Simplified distribution among many application servers
Deployment capabilities that allow your company to deploy as many environments as you desire, pay for the temporary usage, and then scale back when the resources are not used
A unified management portal and application performance management (APM)
PaaS applications offer the best Total Cost of Ownership (TCO) and fastest time to market for custom solutions and are valuable for both startups and enterprises. PaaS makes the development, testing, and deployment of applications simple and cost-effective.
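As a companion to the push-button scaling bullet above, and the Black Friday example earlier, here is a sketch of scaling a fleet out and back in with AWS’s boto3 SDK. The Auto Scaling group name and instance counts are hypothetical.

```python
import boto3

autoscaling = boto3.client("autoscaling", region_name="us-east-1")

# Black Friday is coming: scale the web tier out.
# "web-fleet" is a hypothetical Auto Scaling group name.
autoscaling.set_desired_capacity(
    AutoScalingGroupName="web-fleet",
    DesiredCapacity=5000,   # pay for these servers only while they run
    HonorCooldown=False,
)

# After the sales rush, shrink back and the meter stops.
autoscaling.set_desired_capacity(
    AutoScalingGroupName="web-fleet",
    DesiredCapacity=50,
)
```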
Infrastructure as a service (IaaS) In an IaaS environment, you’re simply moving your virtual machines from your data center to the cloud. The impact of doing so is that you no longer have to worry about capacity or costs like cooling and equipment. You are not required to move all your machines to the cloud. In fact, most organizations run a combination of servers hosted in the cloud and servers in their own data center, an approach called hybrid cloud. Selecting which servers move is essentially a business decision based on considerations like cost, hardware lease renewals, or software licensing requirements. For example, if App A is running on old hardware that needs to be replaced, why not move it to the cloud instead of purchasing new hardware? Your server will still be the same as it was before, with the same configurations and settings. Nothing has changed except where the server lives. On-premises (private cloud) How is a private cloud (one where you own all the resources) different from a typical data center? A data center doesn’t provide those resources as a service, whereas a private cloud does, with automation and self-service capabilities as the key elements. You can leverage your existing hardware investment to give your business the same experience as using a cloud provider. This still provides some of the same business value and benefits as a true public cloud, but without a huge expenditure. That’s one main reason a business would want to make its data center a private cloud: it has already spent money on that expensive on-premises data center. It may want the benefits of the cloud, but it can’t throw those dollars out the window. The private cloud is the answer to gaining cloud benefits now without getting rid of your existing data center investment. As the equipment and licenses expire, the transition to a hybrid or public cloud makes more sense. You can transform your data center into a private cloud by meeting two goals:
Make your data center capable of providing self-service, so that staff can request servers, infrastructure, or any other IT resource through a centralized solution, such as a portal
Automate your data center, so that servers, infrastructure, and other IT resources are provisioned, managed, and deprovisioned without IT intervention
Automation and self-service create tremendous efficiencies for IT personnel, freeing up their time to innovate and drive business decisions rather than spending it on repetitive processes and firefighting. Conclusion Now you know a little more about the cloud, in very real terms. That’s the first step to deciding what you can actually use the cloud for, and when. Companies move to the cloud for agility in an ever-changing world. While some organizations are still swamped managing their infrastructure and IT, others have moved to the cloud and are able to focus on driving the business. This provides an unmatched competitive edge, and with the cloud market continuing to grow in leaps and bounds, businesses that don’t stay current in this technology revolution will be hard-pressed to remain competitive. Cloud is a huge proposition, but Fusion has the expertise to help you get there. We’ll help you understand the strategies that will work specifically for your business and lead the way to that big cloud in the sky with our team of architects and business professionals.
Get to know Dynamics 365 and its licensing
The risk of outdated ERP systems With a lack of IT support to maintain siloed and outdated ERPs, businesses end up paying more to maintain an already obsolete system. Rising IT costs and falling productivity ultimately cause companies to fall further behind their competitors. Microsoft’s release of Dynamics 365 packages the entire Dynamics ecosystem (AX or NAV, and CRM) into one enterprise solution, enabling access to the modules you need, only when you need them. The new model also provides fine-tuned control over each user, ultimately increasing productivity. With Dynamics 365’s reduced startup and licensing costs and agile implementation of business processes, upgrading an ERP system has never been so viable. What is Dynamics 365? With the release of Dynamics 365, Microsoft has managed to package its entire Dynamics ecosystem (AX or NAV, and CRM) into a compelling enterprise solution, delivering a suite of intelligent business applications. The main purpose of Dynamics 365 is to unify the capabilities of CRM and ERP. Dynamics 365 ships with several different applications:
Operations (enterprise edition)
Financials (business edition)
Sales
Customer Service
Project Service Automation
Field Service
Microsoft PowerApps
Microsoft Flow
In addition to this cohesive platform, Dynamics 365 has a new SaaS-based licensing model that may appear confusing at first, but once understood, it provides fine-tuned control over each user and can help reduce overall business costs. Dynamics 365 versions Dynamics 365 ships in two editions. The business edition is optimized for smaller companies of 10-250 employees and contains Dynamics 365 for Financials. The enterprise edition is suited for businesses of 250+ employees and includes Dynamics 365 for Operations (formerly Dynamics AX). For the purposes of this article, we will focus on the enterprise edition. How Dynamics is licensed Dynamics 365 can be accessed either externally or internally. An internal user utilizes the system through the Dynamics 365 graphical user interface (GUI). An external user (third party) accesses the system through other means, usually a custom, public-facing portal or an application programming interface (API). External users do not require a subscription license (SL), as external access is included with all Dynamics 365 subscriptions. Any internal user must have an SL tied to their Office 365 account. There are two classifications of SLs: User and Device. User SLs are assigned to named users and are managed through the Office 365 Portal. Device SLs are assigned to a specific device and are designed for shared use, so each person using the device does not need an individual login. For example, a POS terminal or a handheld warehouse mobile device would not need per-user logins. A typical Dynamics 365 subscription contains a mix of both User SLs and Device SLs. User SLs come in two types: Team Members (Light Users) and Full Users. Team Member SLs are for users who consume data or reports from Dynamics 365 and perform minimal tasks, such as time/expense entry and HR record updates. Team Member SLs cost significantly less and are preferred if a user meets the criteria. Full User SLs are for “power users” who require full functionality. Plan vs. app licensing Full Users can be licensed on an application basis or as a plan. Plans bundle multiple Dynamics 365 applications together. Bundling SLs in a plan provides more functionality at a reduced rate. There are two plans available, Enterprise Edition Plan 1 and Enterprise Edition Plan 2. Plan 1 includes PowerApps, Sales, Field Service, Customer Service, and Project Service Automation. Plan 2 contains everything in Plan 1 plus Dynamics 365 for Operations. Tiered pricing Confused yet? Now let’s introduce tiered pricing. Microsoft offers a reduced rate for customers who purchase a large number of User SLs. Tiered pricing rates are only available for Enterprise Edition Plan 1 SLs, for both full users and team members. Plan 2 does not qualify for tiered pricing. To make matters more confusing, Plan 1 and Plan 2 seats both count toward Plan 1 tier qualification. Below is the breakdown of the different enterprise edition tiers:
Tier 1: 0–99 users
Tier 2: 100–249 users
Tier 3: 250–499 users
Tier 4: 500–999 users
Tier 5: 1,000+ users
Note that tiered pricing is not available to customers purchasing licenses through the Microsoft Online Subscription Program or through academic and charity programs. Default subscription environments A set of standard instances comes with every subscription of Dynamics 365 by default. Additional storage and instances can be purchased if needed. It is worth noting that Plan 1 Business Apps share a single tenant and infrastructure, while Dynamics 365 for Operations runs on a separate tenant with its own set of environments. Plan 1 business applications default environments By default, every Plan 1 subscription contains a production instance, a non-production instance, the Dynamics 365 Admin Portal, and 10 GB of database storage. The database storage increases based on the number of users on the system, at a rate of 5 GB per every additional 20 full users. The sketch below shows how the entitlement grows as users increase.
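Since the original chart is not reproduced here, a few lines of Python show the same growth. This is a sketch of one reading of the stated rule, assuming the base 10 GB covers the first 20 full users; it is not an official Microsoft calculator, so confirm actual entitlements with your licensing partner.

```python
def plan1_database_storage_gb(full_users, base_gb=10, step_gb=5, step_users=20):
    """Database storage entitlement: 10 GB base, +5 GB per additional 20 full users."""
    extra_blocks = max(full_users - step_users, 0) // step_users
    return base_gb + step_gb * extra_blocks

for users in (20, 40, 100, 500):
    print(users, "full users ->", plan1_database_storage_gb(users), "GB")
# 20 -> 10 GB, 40 -> 15 GB, 100 -> 30 GB, 500 -> 130 GB
```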
Operations default environments Dynamics 365 for Operations operates on a different tenant from that of the Plan 1 Business Apps. If Plan 2 is purchased, the customer obtains access to both the Plan 1 Business Apps and the Dynamics 365 for Operations app. By default, every Operations app comes with a production instance, two non-production instances, and default storage capacity. The production instance comes with disaster recovery and high availability and is monitored around the clock by Microsoft for overall health and availability. The first non-production instance is a tier 1, all-in-one sandbox environment. This environment requires the purchase of database storage, as it doesn’t come with any by default. The tier 1 instance is also referred to as a developer instance and is provided for the life of the subscription. The second non-production instance is a tier 2 standard acceptance testing instance. This instance is a multi-box instance and includes 10 GB of default storage. If more instances are required, they can be purchased at a monthly cost. Dynamics 365 for Operations comes with 10 GB of database storage, and additional storage accumulates as the number of users increases: for every additional 20 full users, database storage is increased by 5 GB. Every subscription additionally receives 100 GB of Azure BLOB storage. Fusion knows Dynamics 365 As a Microsoft Gold Partner, we can help you grow and evolve with Microsoft business solutions. We specialize in Dynamics CRM & ERP, Office 365, and Azure cloud services, covering everything from licensing and migration to ongoing support and infrastructure. Want to talk? Let us know.
SEO measurement: Beyond the rankings
One of the most difficult aspects of search engine optimization is measuring the success of a campaign. Historically, SEO providers have given their clients a list of “valuable” keywords and the position of the client’s website for each keyword. Unfortunately, this kind of ranking data provides a comparatively narrow view of search engine activity because it only measures expected results. To get maximum insight into an SEO campaign, key performance indicators (KPIs) need to be established that capture the complete picture. Keeping score Search engines weigh more than 300 data points when calculating the score or rank of a page, so measuring how your site, page, or campaign will perform for a specific request is a very challenging problem. A good starting point is to begin recording and measuring a variety of on-site and offsite indicators in order to develop a custom SEO solution based on the competitive landscape. You’ll need to test and develop metrics that provide relevant insight into your site’s ability to rank versus the competing sites currently ranking for your targeted traffic. On-site measurements On-site measurement begins by scoring the content of your website based on quantity, quality, and structure:
The quantity score is the number of pages of unique content and the rate at which new pages are added to the website.
The quality score is more subjective, but relates to how relevant the content is to the questions being asked by your target demographic.
The structure score looks at items like URLs and HTML tags to determine the ease with which the content can be indexed.
All of these measures are then combined into a content score, which is compared to top-ranking competitors. The content scoring also identifies gaps in subject matter and opportunities for new topics. User experience is scored by measuring bounce rate, pages per visit, and time on your site. This data is evaluated by device type to make sure that all visitors have a similar experience. It is also important to understand your site’s performance by checking PageSpeed and YSlow scores. In addition, the actual response time of each page and its supporting assets determines a speed score. Offsite measurements Search engines rely extensively on outside factors to determine website relevance. Measuring several leading metrics to identify potential opportunities can expand the online reach of your site. Begin by evaluating incoming links to the website (follow and nofollow) to determine the number and quality of external websites linking to your site. During the backlink analysis, links that should be disavowed because of potential link-related penalties, such as those associated with Google Penguin, must be identified for future action. Finally, check your social media activity to determine what content is being shared and/or discussed and the overall reach of your site on social media platforms. Business factors In order to measure the return on investment (ROI) of any online marketing activity, it is important to have well-defined goals for your website and visitors. Start by developing a value for each type of conversion. On an e-commerce site, it is very easy to define the value because the visitor has put items into a shopping cart and either completed the purchase or abandoned the cart. If your website supports a large brand or collects leads, the definition of value is more difficult. However, value always exists, and it is important to agree on a value in order to report on the business success of web activities.
Once the business goals are defined, you’ll need a method for segmenting different online marketing activities so each channel can be measured independently. SEO traffic has traditionally been defined as visitors who arrive from organic search results without a brand name in their search terms. However, with Google’s update to secure search, the availability of keyword data has been reduced significantly and excluding brand-name keywords has become more difficult. To compensate for the lack of keyword data, you should now consider an SEO visitor to be any visitor who enters your site from organic search with a landing page that is not the home page (and possibly a few other pages).
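Here is a minimal sketch of that segmentation rule applied to exported analytics data. The field names and excluded-page list are hypothetical; adapt them to whatever your analytics platform actually provides.

```python
# Hypothetical analytics rows; field names are illustrative, not a specific API.
visits = [
    {"source": "organic", "landing_page": "/"},
    {"source": "organic", "landing_page": "/blog/seo-measurement"},
    {"source": "cpc", "landing_page": "/blog/seo-measurement"},
]

EXCLUDED_LANDING_PAGES = {"/", "/contact"}  # home page, plus a few others if needed

def is_seo_visit(visit):
    """Organic visit that didn't land on the home page (a proxy for non-brand search)."""
    return (
        visit["source"] == "organic"
        and visit["landing_page"] not in EXCLUDED_LANDING_PAGES
    )

seo_visits = [v for v in visits if is_seo_visit(v)]
print(f"{len(seo_visits)} of {len(visits)} visits attributed to SEO")
```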
This is when the fun begins. You can now measure visitor traffic from SEO and compare it to other marketing activities. The ability to show a financial return on SEO is possibly the most important factor to business stakeholders and executives. Measuring the success of your campaigns with KPIs Using the metrics outlined above can provide a clear picture of where SEO effort is being applied and how it impacts your business financially. The reporting also identifies successful strategies that you can expand on based on your KPIs. It is important to remember that the measurement and execution of an SEO campaign never end. Search engines test changes to their ranking algorithms daily. New content, links, and social media activity are also constantly in flux, and without continuous monitoring, even successful SEO campaigns will eventually fail.
The buzz about artificial intelligence, machine learning and deep learning: An overview
The B2B and B2C markets are abuzz with the terms artificial intelligence, machine learning, and deep learning. But what do these terms mean, exactly? They’re often used very loosely, and you may think they’re interchangeable. But they’re not. Here’s a short overview of artificial intelligence, machine learning, and deep learning to help you cut through the static and determine which solution is right for you and your business. Working definitions Artificial intelligence (AI) is the term for the broad discipline that includes anything related to developing machines that are “intelligent” through programming. This includes many daily items you’re familiar with, from smartphones and marketing software to chatbots and virtual assistants. Machine learning (ML) refers to machines and systems that can learn from “experience” supplied by data and algorithms. ML is often used interchangeably with AI, but it’s not the same thing — ML is a developmental outgrowth of AI. Deep learning (DL) is a further developmental outgrowth of ML, but applied to even larger data sets. It uses multi-layered artificial neural networks to deliver high accuracy in assigned tasks. In terms of historical development, AI came first. It serves as the foundational discipline from which ML evolved. And ML is the foundational discipline from which DL evolved. One way to conceptualize their relationship to one another is as nested arenas of AI development along a timeline: AI is the outermost arena, ML sits inside AI, and DL sits inside ML. Artificial intelligence overview and how it works In its broadest sense, AI refers to machines programmed to act according to well-defined rules and responses. The responses are confined to the set of rules that are provided, and the machines can’t deviate from those rules, except when they fail. A very basic example of AI would be your clothes dryer. You can set a specific time and temperature, and the machine performs the task according to the instructions given. It doesn’t have the ability to make decisions or make any changes by itself. A more sophisticated example would be configuring your CMS to deliver personalized website experiences. By analyzing a targeted selection of data points about your customer and writing the appropriate logic, your website can display the most relevant content. In neither case is the machine capable of being more than its programming — even if that programming makes the machine very capable of accomplishing its assigned tasks. Machine learning overview and how it works “ML is the science of getting computers to act without being explicitly programmed.” (Stanford University) Machine learning is a different approach to developing artificial intelligence. Instead of hand-coding a specific set of rules to accomplish a particular task, in ML the machine is “trained” using large amounts of data and algorithms that give it the ability to learn how to perform the task. Over the years, the algorithmic approaches within ML have included and evolved from decision tree learning, inductive logic programming, linear and logistic regression, clustering, reinforcement learning, and Bayesian networks. Currently, there are three general models of learning used in machine learning:
Supervised learning
Unsupervised learning
Reinforcement learning
Supervised learning Right now, most machine learning is supervised, which still requires a lot of human intervention to accomplish the training. For example, a “supervisor” has to manually tell the spam filter what to look for in spam vs. non-spam messages (e.g., the words “Western Union” or links to suspicious websites) until the machine has gained enough “experience” to learn and accurately apply the distinctions. The training goes something like this: the algorithm is first fed an input data set of millions of emails already tagged with spam/not-spam classifications, which trains the ML system on the characteristics of “spam” email and how to distinguish them from those of “not spam” emails.
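Here is a toy version of that supervised training loop, using scikit-learn. Four hand-labeled emails stand in for the millions described above, and a naive Bayes classifier stands in for whatever model a production spam filter would actually use.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# A tiny stand-in for the millions of pre-labeled emails described above.
emails = [
    "Send money via Western Union today",
    "Claim your prize at this suspicious link",
    "Meeting notes from Tuesday attached",
    "Lunch on Thursday?",
]
labels = ["spam", "spam", "not spam", "not spam"]

vectorizer = CountVectorizer()
features = vectorizer.fit_transform(emails)  # word counts become the features

model = MultinomialNB()
model.fit(features, labels)  # "supervised": the labels are the supervision

test = vectorizer.transform(["Western Union transfer waiting for you"])
print(model.predict(test))  # expected: ['spam']
```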
look for the words “Western Union” or links to suspicious websites) until the machine has gained enough “experience” to learn and accurately apply the distinctions. The training goes something like this: the algorithm is first given an input data set of millions of emails that are already tagged as spam or not spam, and from those examples it learns the characteristics that distinguish ‘spam’ emails from ‘not spam’ emails. (A minimal code sketch of this kind of training appears at the end of this overview.) Unsupervised learning In general, unsupervised learning is more difficult to implement than supervised learning. But it’s very useful when your data set has no known answers and you’re searching for hidden patterns. The system has to train itself from the data set provided. Two popular types of unsupervised learning are clustering and association. Clustering groups similar things together, dividing the elements of an existing data set into groups according to a previously unknown pattern. For example, clustering a customer demographic by education or income can reveal which groups tend to favor one product over another, allowing you to target each cluster of customers more effectively. Association involves uncovering rules that describe large portions of your data, such as “people who buy X also tend to buy Y.” Online book or movie recommendations, for example, are based on the association rules uncovered from your previous purchases or searches. Association algorithms are also used for shopping-cart analysis: given enough carts, the association technique can help predict another item you might like to put into your cart. Reinforcement learning In reinforcement learning, the system learns by trial and error through training characterized by virtual rewards and punishments. The approach is often explained with an analogy from child development: instead of telling a child which piece of clothing goes into which drawer, you smile when the child makes the right choice on their own and frown when the child makes the wrong one. After just a few iterations, the child learns which clothes go into which drawer. In reinforcement learning, very complex algorithms are designed so that the machine tries to find an optimal solution. Operating on this principle of reward and punishment, the machine moves quickly through mistakes and near misses toward the correct result, adjusting the weight of previous results against the desired outcome. This allows it to make a different, better decision each time until it is rewarded. Deep learning overview and how it works Deep learning is the newest area of ML and AI. It uses multi-layered artificial neural networks to accomplish tasks such as object detection, speech recognition, and language translation, all with an extremely high degree of accuracy. Artificial neural networks (ANNs) are inspired by the biology of the human brain, specifically the organic interconnections between neurons. The human brain analyzes the information it receives and identifies it via neuron connections, according to past information it has stored in memory. The brain does this by labeling and assigning information to various groups, and it does so in a fraction of a second.
Similarly, when a system receives an input, deep learning algorithms train the artificial neurons to identify patterns and classify information to produce the desired output. But unlike the human brain, artificial neural networks operate via discrete layers, connections, and directions of data propagation. Despite the sophistication of its algorithms, DL is still just another method of statistical learning that extracts features or attributes from raw data sets. The major difference between deep learning and machine learning is that in the latter you need to provide the features manually, while DL algorithms extract features for classification automatically. This ability requires a huge amount of data to train the algorithms: the accuracy of the output depends on the amount of data, and deep learning requires very large data sets. Additionally, due to the sophisticated algorithms, deep learning requires very powerful computational resources, typically specially designed, usually cloud-based computers with high-performance CPUs or GPUs. There are several kinds of artificial neural networks and DL applications you may have already heard of: Convolutional neural networks (CNN) are deep artificial neural networks used to classify images, cluster them by similarity, and perform object recognition. These are the algorithms that can identify faces and tumors and help navigate self-driving cars. Generative adversarial networks (GAN) are composed of two neural networks: a generative network and a discriminative network. GANs are very popular in social media applications. If you feed a GAN a large enough data set of faces, it can create completely new faces that look very realistic but are nevertheless fake. Natural language processing (NLP) is the ability to analyze, understand, and generate human language, whether text or speech. Alexa, Siri, Cortana, and Google Assistant all use NLP engines. Putting AI, ML, and DL to work for you What most of us think of as AI is, more accurately, machine learning. But understanding the history, development, and distinctions between artificial intelligence, machine learning, and deep learning can help you determine which solution would be right for your goals. The solution you choose, however, also depends on the amount and type of data you have access to. Within the last couple of years, almost every company has come to use machine learning or deep learning (and therefore, by definition, artificial intelligence) in some capacity to move its business forward. The competitive gauntlet has been thrown down. Fortunately, tools that were previously only available to enterprise-size companies are now affordable and accessible to mid-market companies, making machine learning the most accessible playground right now. Fusion Alliance provides cloud infrastructure and other ML services that accelerate machine learning modeling, training, and testing for our banking, financial, and retail customers. Read more about our work in AI, ML, and DL: 3 Fusion experts share their machine learning secrets Unlock customer credit insights with machine learning How Donatos uses machine learning to retain customers Conversational marketing and machine learning are shaping the future of retail
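As promised above, here is a minimal sketch of the supervised learning idea in practice, written in Python with the scikit-learn library (one common choice among many). The six-message data set, the labels, and the messages scored at the end are all invented purely for illustration; a production spam filter would be trained on millions of tagged emails, as described earlier.

# A minimal supervised-learning sketch: training a toy spam filter.
# Every message and label below is invented purely for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

emails = [
    "Claim your Western Union transfer now",    # spam
    "Win a free prize, click this link",        # spam
    "Meeting moved to 3pm, agenda attached",    # not spam
    "Quarterly report ready for your review",   # not spam
    "Urgent: verify your account to win cash",  # spam
    "Lunch tomorrow with the project team?",    # not spam
]
labels = [1, 1, 0, 0, 1, 0]  # 1 = spam, 0 = not spam (the human "supervision")

# Turn each email into a numeric feature vector, then fit a simple classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(emails, labels)

# The trained model can now score messages it has never seen before.
print(model.predict(["You won a Western Union cash prize"]))   # likely [1]
print(model.predict(["Agenda for tomorrow's team meeting"]))   # likely [0]

The pipeline mirrors the training loop described above: labeled examples in, learned distinctions out. Swapping in a larger, real data set is a data change, not a code change.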
Automated UI tests: Taming the tangle
Ever tried to detangle a big box full of wires and cables to get to that one power cable? It often seems that no matter how much care we put into placing those cables into the box, we inevitably end up with a tangled mess that tests both the physical integrity of the box and our patience. If you feel (or are starting to feel) that your automated UI testing efforts are like a box full of cables, then this post is for you. Some initial thoughts If you are coming from the preceding post, Automated UI Tests: Taking the Plunge, then you probably have a solid notion of what to expect here. The ideas and advice below are not designed to be a definitive list of items that, if followed, guarantee success in automated UI testing. Part of what makes the world of software development exciting is that every situation is unique and presents its own challenges. Instead, our aim is to get your creative juices flowing and provide the right context to put you on a confident path to automated UI testing with your own artifacts and processes. Let’s start with flaky tests If you are at all familiar with testing environments, you know which flaky tests we’re talking about. Sometimes they pass, sometimes they fail. There seems to be no rhyme or reason to the passes and failures, and you find yourself holding your breath every time a build is triggered. Flaky tests can be toxic to your efforts in several ways, but the most direct impact happens when the people responsible for the app begin to lose confidence in the automated tests. Angie Jones delivered a fantastic talk at SauceCon 2017, full of strategies and conversation starters on the topic of flaky tests. The bottom line: don’t allow them to fester. The first steps are to isolate them and move them to another branch so that they stop poisoning a build that otherwise provides consistently valuable information. Where the wild browsers roam It can feel like graduation day when you’ve just finished a batch of automated tests; you have finally completed a phase of the project, and it’s ready to run for the whole world to see. Then the reality of browser/device coverage sets in, and the companion reality of run time rears its ugly head. In the context of a web app, for example, it’s typically expected that you can execute against at least the most current versions of the Big Four browsers (Chrome, Safari, Firefox, and IE). You get to a point where spinning up virtual machines on your development box feels slow and inevitably hijacks your computer, making it difficult to continue working during a build. Luckily, there are a number of solutions out there to address this problem. Cloud providers such as SauceLabs and Browserstack will work for some situations, while a hosted solution like Element34’s Selenium Box will work for others. If your needs are not horribly extensive, you could also look into standing up your own Selenium Grid. A word of advice: tests developed against one browser may not behave as expected on the others. This could (and very well may) be its own post in the future. Keep in mind that you may need to add branching logic depending on the browser or device in question. Additionally, the sentiments from the previous post are relevant here: if you are embarking on a new idea for multiple browser/device/host support, start with a small subset of your tests.
It will be easier to work out the kinks of your brand-new Grid with a limited number of tests that can finish in a few minutes than by running your whole suite. What’s in a framework? “Framework” is one of those words in our industry that can mean radically different things to different people. We’re referring to it here as a repeatable, generic solution that helps you quickly bootstrap new projects. No matter which technology and tool stack you are using, there are going to be ways to cook up little bits of the process that are transferable. Thoughts toward this usually come around the same time you want to start adding automated UI coverage to the next app on your list. A good number of the frameworks we’ve built and seen hit these major feature points: Handle input and desired properties: Do you want to be able to specify things like the browser, host operating system, versions, additional app binaries, and properties from an external configuration file? What would you prefer to be command-line arguments? What about external data files? Manage the object that controls the browser, device, etc.: This is the WebDriver object in the Selenium WebDriver world. How do you get this object and fire up the application? Do you want to abstract it behind a factory (see the sketch at the end of this post)? Do you need to proxy any of the behavior to start and stop the browser, emulator, or simulator? Establish your reporting features of choice: How are you currently digesting the output that your builds produce? Are the mechanisms that produce that output tightly coupled to the tests of a particular app? How would you abstract those reporting mechanisms so that they can be used for any app? If you’re entertaining different options here, it’s worth taking a look around. Include examples: What does a typical test flow look like? If you’re using page objects, what would a simple one include? This is not only useful for newcomers to the automated testing effort, but it also serves as a good, executable reminder when starting a fresh project. Closing remarks As with most problems in software development, there is no silver-bullet solution here: detangling your own box of cables and wires doesn’t come with a surefire checklist for success. At the very least, take comfort in the fact that you are not alone, and use the advice above to help get the wire-wrangling process started.
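To make the factory idea above concrete, here is a minimal sketch in Python with Selenium WebDriver. Treat it as one way to do it under stated assumptions, not a definitive implementation: the configuration file name and its keys ("browser", "grid_url") are invented for illustration.

# A minimal sketch of a config-driven WebDriver factory.
# The config file name and its keys ("browser", "grid_url") are hypothetical.
import json
from selenium import webdriver

def make_driver(config_path="test_config.json"):
    """Build a WebDriver based on an external configuration file."""
    with open(config_path) as f:
        config = json.load(f)

    browser = config.get("browser", "chrome").lower()
    grid_url = config.get("grid_url")  # e.g., your own Grid or a cloud provider

    # Pick the options object that matches the requested browser.
    options = (webdriver.FirefoxOptions() if browser == "firefox"
               else webdriver.ChromeOptions())

    if grid_url:
        # Run remotely against a Selenium Grid or hosted service.
        return webdriver.Remote(command_executor=grid_url, options=options)
    if browser == "firefox":
        return webdriver.Firefox(options=options)
    return webdriver.Chrome(options=options)

# Tests ask the factory for a driver instead of constructing one directly,
# so switching browsers or targeting a Grid is a config change, not a code change.
driver = make_driver()
driver.get("https://example.com")
driver.quit()

The payoff of the abstraction is exactly the portability discussed above: the same test code can run locally against Chrome or Firefox, or fan out across a Grid, purely by editing the configuration file.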
Customer experience: What it is and what it isn’t
What is customer experience? With the experience economy taking a stronger hold, it is imperative for organizations, regardless of industry, to understand what “customer experience” means. This phrase has been defined in many ways, but our definition aims to explain the scope of customer experience: Customer experience: the interactions between an organization and a customer over the duration of the relationship, and the customer’s perception of their engagement and the supplied products or services. To effectively address the entire scope of the customer experience, organizations need a 360-degree view of the customer. The very core of a customer experience strategy (CX strategy) is understanding the behaviors, needs, and wants of the customer, posing the question, “What does a great customer experience mean in the eyes of the customer?” According to a survey conducted by the CMO Council, a great customer experience includes quick response times to customer requests or complaints, rapid response to issues and challenges, products that reflect customers’ own needs and wants, and consistency of the experience across all touchpoints. This customer viewpoint has created a divide in the experience economy between businesses that quickly address customers’ changing behaviors and those that do not. Implementing a CX strategy is a necessity to thrive in the experience economy Today’s customers exhibit unprecedented new behaviors. Incumbent companies without a 360-degree view of their customers, or without a flexible, nimble framework to accommodate new behaviors, fall further behind competitors. Slow responses to increasing customer demands widen the gap between business goals and customer needs. In response to the emergence of the experience economy, influencers in their respective industries are leveraging digital technologies to become more customer centric. These forward-thinking players are breaking down the silos between departments and operatives to unify employees in delivering experiences that match or exceed those of industry leaders. As a result, businesses can no longer use traditional business models to preserve customer loyalty. It’s not enough to simply know adapting to the experience economy is a necessity. Businesses need to take the next step and actually invest in a new strategy to fully embrace the marketplace shift. Investing in a customer experience strategy is not extravagant In response to the experience economy, the roles of CFOs and CIOs are changing, shifting focus from reducing costs to enabling organizations to become more flexible and scalable to grow revenue. Customer experience spending has become the top investment priority for accelerating businesses, and such investments are paying off. For a company with $1 billion in annual revenue, a moderate increase in CX generates an average revenue increase of $823 million over three years. The saying, “What’s good for the customer is good for business,” is truer than ever. Providing positive customer experiences is directly correlated with customers who purchase again, don’t switch to competitors, and recommend the company, resulting in loyalty-based revenue and evident ROI. Investing in the right digital strategy and technology ecosystem creates more access and engagement with brands, increasing opportunities for a better customer experience. These technologies are being used to implement CX strategy by converging people, processes, and technology.
Companies have linked CX improvements through digital to reductions in service, acquisition, processing, and engineering costs. Forward-thinking companies like Airbnb and Tesla Motors are great examples of this, using new digital tools and systems to add immense value and gain a competitive edge. A customer-centric strategy also translates into an investment in employees. Employees are happiest when they feel they have a purpose, and this is especially true for millennials, who have surpassed Generation X as the largest share of the American workforce. A unified aim to provide a great customer experience gives employees a purpose, ultimately lowering employee turnover while adding value and productivity to business processes. Where to start An effective CX strategy means mastering engagement opportunities with new customers and nurturing existing customer relationships. Customer journey mapping uncovers what your customers need and when they need it along their end-to-end journey. Misunderstandings of customer behaviors and preferences, along with low adoption of evolving digital technology, serve as barriers to an effective CX strategy. Customer journey mapping breaks down these barriers to help organizations design, develop, and deliver experiences that meet or exceed customer expectations, all while aligning to the business strategy by shifting the focus from immediate sales to delivering value to customers. Now that you know what customer experience is, why it is needed, and where to start, your business can begin gearing up for the transition into the experience economy and gain the competitive edge required to thrive. Need help getting started? Let us know.
Omnichannel 101: What it means for your digital strategy
Born in the retail industry, omnichannel isn’t just a buzzword to add to your marketing “bingo” card. (You know the one.) Omnichannel strategies provide digital customers with a seamless experience of a company’s brand, no matter where or how those customers engage with it. From brick-and-mortar stores to social media and websites, and from telephone to laptop and mobile device, omnichannel provides customers with consistent, integrated service. More than that, omnichannel better enables your company to meet your customers where they live. In a July 2014 Forbes article, Daniel Newman defined omnichannel as “. . . a reflection of the choice that consumers have in how they engage a brand, and therefore is best represented as how brands enable their clients and consumers to use these channels to engage with them.” That hasn’t changed. What is valuable about Newman’s definition is that it focuses on consumer choice, placing the customer journey squarely at the center of the strategy. Omnichannel, at its most faithful execution, is a transition from transactional, forced interactions with brands to a system of continuous, in-context enablement from the brand. Newman’s definition also widens the focus of the approach to include both products (retail) and services. How is omnichannel different? Offering products and services across multiple channels, accessible across multiple devices, is nothing new. Multichannel marketing, or offering brand experiences in many channels for many devices, has been evolving for some time. Margaret Rouse of WhatIs.com provides perhaps the simplest explanation of the difference between multichannel and omnichannel marketing: “What distinguishes the omnichannel customer experience from the multichannel customer experience is that there is true integration between channels on the back end.” The distinction is an important one, both for marketing and for IT, because integration implies big data and the business intelligence that can be gained from it. The reality is that we are building systems and processes that use data from line-of-business (LOB) systems: customer interactions on web, mobile, and social; brand ambassadors like customer service and sales; and behaviors in brick-and-mortar stores where consumer products are the focus. Integrating the systems used to manage marketing content puts customer data at the center of the brand interaction, which is what makes a seamless customer experience possible. What omnichannel is not With this basic understanding of what omnichannel marketing is, let’s talk about what it is not, because that has major implications for your organization and its digital strategy. Omnichannel is not a traditional marketing funnel The traditional marketing funnel moves prospective customers through several well-defined, brand-controlled phases, from brand awareness to purchase and, if you’re really lucky, to brand advocacy and repeat business. By contrast, omnichannel is a continuous cycle, where purchasing can occur at any point during the customer’s interaction with the brand, as well as across devices. And not all of the content with which the consumer engages (or, for that matter, which the consumer seeks out) is brand-controlled. User reviews, social networks, and other consumer-driven content all influence purchasing decisions at various points in the customer journey.
What this means for you is that you not only have to make the right content available at multiple consumer touchpoints, but you also have to be aware of the content you don’t control so that you are ready to respond to it, indirectly or directly. Analytics become especially important so you can track your customers’ journeys and also help predict where you can make the most impact on their buying or engagement decisions. Omnichannel is not conducted in silos Omnichannel is a marriage of technology and marketing. A true omnichannel experience requires collaboration and cooperation between marketing and IT. Neither division can dictate to the other what will work, which technologies to use, how to use them, or when to implement. Both marketing and IT need to be sensitive to each other’s schedules, initiatives, and resource constraints. What this equates to is organizational change. Omnichannel isn’t a project; it’s a way of operating your business intelligently and dynamically. Pursuing an omnichannel approach to marketing requires you to commit to organizational change. Omnichannel is not a switch you flip on — tah-dah! The crucial integration of systems that differentiates omnichannel marketing from multichannel marketing often requires updating back-end systems. And that’s neither quick nor easy. Omnichannel marketing requires careful planning, along with cooperation and collaboration between marketing and IT. An omnichannel solution doesn’t have to be expensive. But it does need to be well thought out. You have to understand that changes to how data is managed do not happen overnight. Changes to how content is written and published do not happen overnight. Changes to how and where you interact with your customers do not happen overnight. But when they do happen, there are efficiencies to be gained, money to be made (or saved), and loyalty to be won — both from your external consumers and from your internal producers. In short, an omnichannel strategy can completely transform the way you do business. Today consumers have their choice of content, devices, and touchpoints. Your digital strategy must place your customers’ journey squarely at the center. Check out “Vectren moves to omnichannel to improve brand experience” to find out why.
Legacy application modernization: A phased approach for moving to the cloud
Cloud modernization is not new. According to a recent survey, over 51% of responding organizations are in the midst of application transformation and cloud migration. Many businesses have moved at least partially to the cloud. However, moving custom applications off-premises remains a holdover in many organizations. If your organization is experiencing any of the following pain points . . . Your competitors are growing faster than you can because of your legacy software and speed of execution Your on-premises, custom applications can’t scale to meet increasing demand Your customers are asking for features that you can’t add quickly Reliability, security, or performance pose a significant challenge . . . then now may be a good time to reconsider modernizing your aging, custom software. But how do you identify the benefits and justify the cost of rewriting these applications? How do you guarantee that newer equals better and that you haven’t dug your organization into a hole? This article presents a more practical approach to application modernization: selectively identifying the pain points of existing applications and prioritizing a phased approach using specific cloud technologies. All of this gives you a clearer return on investment. It’s an approach we’ve successfully used with clients to develop cloud modernization project plans. Traditional approach to modernization Many application rewrite efforts are born out of statements like: We should be using the latest version of this framework. We should be leveraging microservices. We should be using the cloud. Ask why about any of these statements, and you will get a myriad of responses, most of which are justified and based on industry best practices. But it is not good enough to simply make these statements in isolation. We must always review the software in context and make the best decision possible for that application. Only then will it be clear which framework is the right choice, which cloud services to leverage, and which features are best suited for standalone microservices. Most discussions about cloud migration or modernization consider the entire application at once, which is a mistake. Figure 1 shows a hypothetical application broken up into four modules with four features in each module. Microservice patterns were designed to make the best architectural decision for the smallest unit possible, and that is the core pitfall of whole-application thinking. When discussing app modernization, always review each feature independently. Often in such a review, you will find most of the application is meeting the needs of your business while certain other areas are struggling. Those are the features that should be prioritized and discussed in detail to ensure any modernization addresses the areas of need. Figure 1 A more informed approach to modernization Figure 2 shows the same hypothetical application with several “hot spots” – areas of pain within the current application. After isolating issues in specific features, it is much easier to make informed decisions. Figure 2 Such analysis will often result in one of the following outcomes: A phased approach to modernization – By addressing features of specific concern now, your business can reduce the scope of the remaining migration and achieve the end goal in more granular phases, providing incremental business value along the way.
Improve the outcome of an existing app modernization – Even if your organization is already down the path of re-architecting an existing application, this process is still worthwhile. You may find your engineering team is focused heavily on bleeding-edge technologies without considering whether the technology is the right tool for the job based on the business need. Avoid a long, costly rewrite altogether – The existing application can take advantage of new technologies while existing features that work well remain in place. The benefits of the phased approach Our experience designing app modernization plans tells us that your initiative will have a greater chance of success when you use the second, more deliberate approach. It can reduce costs by rewriting only what needs to be optimized rather than reworking the entire application. In this article, we presented an alternative to costly re-architecture projects that don’t seek to understand the business challenges upfront. Making well-informed business decisions, and understanding and correlating those decisions to the right technology, is critical to differentiating your business from your competition. If you have questions about how to implement this approach in your organization, don’t hesitate to reach out. Click here to discover how we helped McGraw-Hill Education meet the demand for digital education experiences by implementing a state-of-the-art, adaptive learning system. Related Articles: Cloud 101 5 Keys to Planning a Successful Cloud Implementation Which Public Cloud Offering is Right for You
3 Fusion experts share their secrets for machine learning in banking
In the quest to solve its most pressing challenges, the banking industry is being transformed by its adoption of artificial intelligence (AI) and machine learning (ML). Financial institutions are under pressure to better understand their customers, drive a more personalized customer experience, acquire new business, forecast risk, prevent fraud, comply with increasing regulations, improve processes . . . the list goes on and on. Most banks continue to use traditional, expensive analytics tools to tackle these challenges, but they struggle to keep pace with demands, and the tools are difficult to maintain. Machine learning relies on statistical and artificial intelligence approaches to rapidly uncover patterns in complex data, patterns that can’t be discovered through traditional tools. The impact of machine learning in banking While adoption of machine learning in finance is in the early stages, institutions that have leveraged this secret sauce are finding it to be a differentiator. For example, a large regional bank leveraged ML to predict institutional customers’ likely deposits on a daily basis, freeing $40 million in excess cash reserves. Another institution, credit union service organization Primary Financial Company, used ML to synthesize financial and competitive data to price securities, identify buyers, and project trade profitability. PFC can now ascertain, with over 80% accuracy and 70% precision, the likelihood that a particular investor will buy a given investment. For these companies, their early ventures into ML have certainly moved the needle on what they can accomplish. We spoke with three artificial intelligence and machine learning experts at Fusion Alliance to tap into their experience with banks, learn where the market is headed, and get answers to some common questions. Q: What do you see as the 2020 trends in machine learning for banks and credit unions? A – John Dages: 2020 is the year we see machine learning become more democratized. Historically, machine learning engagements have required substantial data science and model training investments. However, the major ML platforms are evolving and providing advanced automated machine learning and feature analysis toolchains, lowering the barrier to entry for ML projects. Our team is also actively monitoring new “explainability” techniques that add deeper transparency to ML-based predictions and insights. Historically, the black-box nature of some ML algorithms (specifically deep neural networks) has made it difficult to relate them to business principles. Ideally, these emerging techniques will increase confidence in ML models early in their lifecycle. In the banking sector, we have seen a great deal of capital chase trading and investments, but we are also seeing ML flow into loan operations, cash management, and general risk. A – Sajith Wanigasinghe: Machine learning applied to fraud detection is a major trend. Artificial intelligence is beneficial here because ML algorithms can analyze millions of data points to detect fraudulent transactions that would tend to go unnoticed by humans. At the same time, ML helps improve the precision of real-time approvals and reduces the number of false rejections. Another leading trend is using robo advisors for portfolio management. Robo advisors are algorithms built to calibrate a financial portfolio to the user’s goals and risk tolerance.
Chatbots and robo advisors powered by natural language processing (NLP) and ML algorithms have become powerful tools for providing a personalized, conversational, and natural experience to users in different domains. A – Patrick Carfrey: Personalized delivery of banking services is going to improve in 2020. New products are entering the marketplace that enable consumer and commercial bank customers to receive relevant account information in real time, at the grain and timeliness that customers want. Q: What is your favorite machine learning use case for banks right now? A – John Dages: Machine learning will change the way banks see credit risk. FICO and the five C’s of credit are limited in features, captive to three agencies, potentially biased, and outmoded. The models we are building will allow lenders to view a complete picture of a borrower, offering customized predictions on creditworthiness. The banks that adopt this model will see an increase in lending opportunities while better understanding the liabilities on their balance sheets. A – Sajith Wanigasinghe: Customer lifetime value is my favorite use case: we can predict how valuable a customer will be within X number of years, so the bank can establish a good relationship with that customer in the early stages. A – Patrick Carfrey: Remarketing/cross-selling is a powerful option for banks right now. Given all the customer data that banks own, including deposits, transactions, and more, ML can tell whether a customer is a good target for a new product in the bank’s portfolio. This is especially relevant as customers are expecting more, and being able to predict customer needs supports that expectation. Related Article: 4 ways banks can leverage the power of machine learning Q: What is the one machine learning data tool you can’t live without? A – John Dages: Excel. Sure, the enterprise data tools are highly capable (and the team spends a lot of time there), but the ability to quickly navigate data, perform simple transforms, and share data with a tool everyone knows is critical. I can’t remember a project where we didn’t get exemptions to install Excel in the banks’ datacenters. A – Sajith Wanigasinghe: The TensorFlow framework is one tool I can’t live without, because it’s the number one framework I use every day and in 99% of our projects. TensorFlow is an open-source machine learning library that helps you develop ML models. The Google team developed it, and its flexible ecosystem of tools, libraries, and resources allows me to build and deploy machine learning applications. A – Patrick Carfrey: TensorBoard. This is TensorFlow’s visualization toolkit, and it provides a nice visual interface for tracing key metrics through the model training pipeline (a minimal sketch of wiring it into a training run follows at the end of this post). Deep learning models can get complex quickly, and being able to explore a model outside of the command line is nice. Clients love the graphs, too! Q: What are the biggest machine learning myths you wish more people understood? A – John Dages: For those beginning to develop an AI/ML center of excellence, there is going to be a gravitational pull toward the cutting edge (deep learning, cognitive science, and others). While there is obviously value there, there is also a multitude of “traditional” machine learning practices and algorithms that are lower in complexity. A deep neural network should be the last resort, not the first option! A – Sajith Wanigasinghe: That machine learning and AI will replace humans.
In fact, machine learning and AI will help you do your job much faster and better, and enable you to focus on the satisfying and important human elements of your role, including creativity and strategy. Think of machine learning and AI as a tool, not a replacement for humans. A – Patrick Carfrey: On every machine learning project I’ve delivered, our clients inevitably ask, “We love the model, but can you tell us more about how the model is making its predictions?” This is a surprisingly challenging question to answer, particularly for black-box neural networks. Fusion has a variety of techniques to provide additional details, but they aren’t necessarily directly correlated to the actual model we’ve developed. If it is insights you seek, not decisions, consider business intelligence tools and processes in lieu of machine learning. There is room for both! Meet our panel of experts: John Dages With 15+ years of technology leadership experience, John brings a unique perspective to companies on their advanced analytics journey. He has led numerous machine learning initiatives for large enterprises across industries, with projects ranging from customer acquisition and retention to securities pricing and trade analytics. John’s background in application development, analytics, systems integration, and I&O helps him formulate how businesses can use data to drive competitive advantage and engineer true intellectual property. Sajith Wanigasinghe Sajith is an expert in machine learning, artificial intelligence, and enterprise-wide, web-based application development. He applies his experience and insights to help enterprises identify and solve challenges across the business that are ideal for machine learning. Sajith has led teams that revolutionized the financial, insurance, food, and retail industries by introducing advanced, intelligent forecasting systems powered by machine learning and artificial intelligence. He holds a B.S. in computer science from Franklin University. Patrick Carfrey Patrick joined Fusion Alliance over six years ago, leading a variety of application development initiatives for a flagship Fortune 500 client. Patrick is a firm believer that software is social, choosing to spend as much time as possible in front of end users to build the best product he can. In that capacity, Patrick has developed and deployed practical machine learning solutions that help better understand and predict customer behavior to drive maximum engagement. He is the Competency Lead for Java at Fusion and holds a B.S. in computer science and engineering from The Ohio State University.
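As a footnote to Patrick’s TensorBoard answer above, here is a minimal sketch of how TensorBoard is commonly wired into a Keras training run. The toy model, random data, and log directory are invented purely for illustration; only the TensorBoard callback itself is the point.

# A minimal sketch: logging Keras training metrics to TensorBoard.
# The toy model, random data, and log directory are invented for illustration.
import numpy as np
import tensorflow as tf

# Toy data set: 100 samples with 8 features each, and binary labels.
x = np.random.rand(100, 8).astype("float32")
y = np.random.randint(0, 2, size=(100,))

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(8,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# The TensorBoard callback writes per-epoch loss and accuracy to the log directory.
tensorboard = tf.keras.callbacks.TensorBoard(log_dir="logs/demo")
model.fit(x, y, epochs=5, callbacks=[tensorboard])

# Then explore the training curves in a browser with: tensorboard --logdir logs/demo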
Ready to talk?
Let us know how we can help you out, and one of our experts will be in touch right away.