On the journey from data to analysis to insight, companies are shifting from a traditional approach and leaping forward into new ways of delivering actionable business intelligence. While the core goals remain the same — enabling data-driven decisions, optimizing cost efficiencies, and driving revenue growth — new tactics demand new skills. The pivot to a use-case model and good governance throughout the data lifecycle meets those challenges while also delivering faster time to insight.

Connecting the business to fit-for-purpose data

The new data mindset is purpose-driven. Based on specific use cases generated by the business, today’s data teams build, deploy, and configure purpose-built data assets that meet the organization’s needs fast. This represents a significant shift from the status quo for traditional data teams, but the streamlined workflow pays off.

To generate fit-for-purpose data, start here:

- Solicit use cases from the business
- Understand and analyze the characteristics and dynamics of the use case
- Assess your existing data portfolio and identify information that might meet the need
- Consider the appropriate technology to synthesize data sets and deliver actionable insights

Establishing a data asset creation workflow pays off in efficiency and value for IT and the business units involved.

Learn more about how to develop data as an asset >>

Speeding time to insight

Traditional data warehousing models impose a high cost for integrating disparate data sets. A legacy workflow might include:

- Amending data architecture
- Creating a semantic model
- Time-consuming extract, transform, and load (ETL) processes for all data sets involved
- Preparing the data
- Making the data available for analysis

Today’s businesses don’t have that long to wait for insights. Modern data technologies like Hadoop make it possible to stage data in a platform for immediate access.
To structure your data and technology architecture toward a use-case-driven model that fosters speed to insight, key considerations include:

- A prioritized list of problems that need solutions
- Any characteristics or constraints that might impact time to value
- Available data assets and technologies, like data virtualization, that would enable you to access and analyze data in place

Once you adopt methods for analyzing data in place, your team can deliver value on a much shorter timeline.

Learn more about data architecture and integration >>

Improving data literacy

The demand for data-driven insights continues to accelerate. Companies at the forefront of the shift from volume to velocity use analytics pervasively throughout their organization and have the technology and agility to act on insights quickly. To become a competitive, speed-driven organization, your business must excel throughout the analytics lifecycle:

- Acquire: Harvest data quickly by exploring evolving big data technologies and optimizing first-party data strategies
- Analyze: Identify the most impactful insights
- Act: Implement the insights iteratively and strategically

That final step involves your organization’s data literacy. Providing insights is one thing, but training your people to take the next right action on the data they see might require new skills. Upskilling the workforce to better understand and use data pays off richly in transformative accuracy, speed, and confidence.

Learn more about building data literacy within your organization >>

Implementing harvest-to-delivery data governance

As the volume of available data continues to increase, businesses are building complementary abilities to understand and use it. But implementing tools and technology to harvest, integrate, and analyze data without robust governance frameworks opens companies up to significant risk. Building strong governance into your data asset creation and management workflows from the start can help.
Learn more about how to implement good data governance >>

Elevating data leaders

As the world becomes more digital and more customer behaviors move to a mobile context, businesses are changing to meet and match digital footprints with geospatial dimensions. Leadership must keep pace.

The need for a Chief Data Officer (CDO) at the table isn’t really a question anymore. Today, leading companies are asking where analytics and digital belong in the leadership playbook. To get the most value out of your data management, the right team members — with the right support and authority in place — could not be more important.

Learn more about the importance of empowering your CDO >>

Data management can be complex. A strategic viewpoint can help. Find out more about Fusion’s approach to strategic data management, or ask us your questions. Wherever you are on your data journey, we can help you keep moving forward.
Developing a culture of and commitment to viewing data as an asset within your organization not only ensures good governance — and compliance with evolving privacy regulations — it also gives your business the insights needed to thrive in the rapidly changing digital world.

Understand the lifecycle of data as an asset

To encourage good data management processes, it’s important to understand the lifecycle of a data asset originating outside of IT. In these cases, data from multiple sources is blended and prepped for consumption, which typically includes steps to validate, cleanse, and optimize data based on the consumption need — and because these processes happen outside of IT, be on the lookout for potential security or governance gaps.

While individual circumstances vary, from a big-picture perspective the data asset development lifecycle generally follows these steps:

Intake: Data assets can only be created or derived from other datasets to which the end user already has access. While traditionally this was more focused on internal datasets, blending with external data, such as market, weather, or social, is now more common. Ask:

- How are new requests for information captured?
- Once captured, how are they reviewed and validated?
- How is the information grouped or consolidated?
- How is the information prioritized?

Design: Once the initial grouping takes place, seeing data as an asset requires thoughtful design that fits in with the structure of other data sets across the organization. Ask:

- How will new datasets be rationalized against existing sets?
- How will common dimensions be conformed?
- How does the consumption architecture affect the homogeneity of data sets being created?

Curation: Depending on the source, data might be more or less reliable, but even lower-confidence information can be extremely valuable in aggregate, as we’ve seen historically with third-party cookies.
The more varied the sources contributing to a data asset, the greater the need for curation, cleansing, and scoring. Ask:

- How will the data be cleansed and groomed based on the consumer’s requirements?
- Will different “quality” or certification levels of the data be needed?

Output: Organizations that view data as an asset prioritize sharing across business units and between tools. Consider implementing standards for data asset creation that take connectivity and interoperability into account. Ask:

- How will data be delivered?
- Will it include a semantic layer that can be consumed by visualization tools?
- Will the data asset feed into a more modern data marketplace where customers (end users) can shop for the data they need?

Understanding: As a shared resource, data assets require standardized tagging to ensure maximum utility. Ask:

- How will metadata (technical and business) be managed and made available to consumers for these sets?
- How is the business glossary populated and managed?

Access: To maintain legal and regulatory compliance and avoid costly mistakes, good governance requires access management. Ask:

- Who will have access to various delivered assets?
- Will control require row- or column-level security, and if so, what’s the most efficient and secure way to implement those controls?

Explore tools that streamline data asset preparation

In many organizations, the data asset lifecycle is no longer a linear journey, where all data proceeds from collection to analysis in an orderly progression of steps. With the advent of the data lake, the overall reference architecture for most companies now includes a “marshaling” or staging sector that allows companies to land vast amounts of data — structured, unstructured, semi-structured, what some have labeled collectively as “multi-structured” or “n-structured” — in a single region for retrieval at a later time.
Data may later be consumed in its raw form, slightly curated to apply additional structure or transformation, or groomed into highly structured, validated, fit-for-purpose, more traditional structures. Podium Data developed a useful metaphor for these three levels of data asset creation:

- “Bronze” refers to raw data ingested with no curation, cleansing, or transformations.
- “Silver” refers to data that has been groomed in some way to make it analytics-ready.
- “Gold” refers to data that has been highly curated, schematized, and transformed so that it is suitable to be loaded into a more traditional data mart or enterprise data warehouse (EDW) on top of a more traditional relational database management system.

To streamline the creation of assets at each of those levels, many organizations adopt self-service tools to ensure standard processes while democratizing asset creation. While the vendor landscape in this area is wide, the following three examples represent key functionality:

Podium, like Microsoft and others, adopted a “marketplace” paradigm to describe developing data assets for consumption in a common portal where consumers can “shop” for the data they need. Podium provides its “Prepare” functionality to schematize and transform data residing in Hadoop for a marketplace type of consumption.

AtScale is another Hadoop-based platform for the preparation of data. It enables the design of semantic models, meaningful to the business, for consumption by tools like Tableau. Unlike traditional OLAP semantic modeling tools, AtScale does not persist a separate copy of the data in an instantiated cube; rather, it embraces OLAP more as a conceptual metaphor. For example, when Tableau interacts with a model created in AtScale on top of Hadoop, the behind-the-scenes VizQL (Tableau’s proprietary query language) is translated in real time to SQL on Hadoop, making storage of the data in a separate instance unnecessary.
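As a concrete, if toy, illustration of those tiers, the following sketch promotes invented records from bronze to silver to gold in plain Python. In practice this grooming would run on the Hadoop or data lake platform itself; the records and field names here are made up for the example.

```python
# Sketch of promoting raw records through bronze -> silver -> gold tiers.
# Stand-in for a data lake pipeline; the records are invented examples.
bronze = [  # raw, as ingested: no cleansing, inconsistent values
    {"region": " east ", "revenue": "1200"},
    {"region": "West",   "revenue": "950"},
    {"region": " east ", "revenue": None},     # record that fails validation
    {"region": "west",   "revenue": "300"},
]

def to_silver(rows):
    """Groom for analytics: normalize fields, drop records that fail validation."""
    cleaned = []
    for r in rows:
        if r["revenue"] is None:
            continue
        cleaned.append({"region": r["region"].strip().lower(),
                        "revenue": float(r["revenue"])})
    return cleaned

def to_gold(rows):
    """Schematize into an EDW-style aggregate: revenue by conformed region."""
    out = {}
    for r in rows:
        out[r["region"]] = out.get(r["region"], 0.0) + r["revenue"]
    return out

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)   # {'east': 1200.0, 'west': 1250.0}
```

The silver step normalizes and validates; the gold step conforms the region dimension into a warehouse-ready aggregate.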
Alteryx is also a powerful tool for extracting data from Hadoop, manipulating it, and then pushing it back into Hadoop for consumption.

Keep security in mind

It is worth noting that many self-service tools have a server component in their overall architecture that is used to implement governance controls. Both row-level security (RLS) and column-level security (sometimes referred to as perspectives) can be put in place, and that security can often be implemented in more than one way. Many of these tools can leverage the group-level permissions and security that already exist in your ecosystem. Work with a consulting services partner, or with the vendors themselves, to understand recommended best practices for configuring the tools you have selected in your environment.

Whether you’re evaluating self-service data tools or looking for ways to shift your organization’s culture toward seeing data as an asset, we can help. Fusion’s team of data, technology, and digital experts can help you architect and implement a comprehensive data strategy, or help you get unstuck with a short call, workshop, or the right resources to reframe the questions at hand.

Read about key considerations for data democratization >>
Learn more about data strategy and how to get started >>
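To make the two security controls above concrete, here is a minimal sketch of row-level and column-level filtering driven by group membership. This is a toy policy model, not any vendor’s actual configuration; the group name, policy table, and fields are invented.

```python
# Toy sketch of row-level and column-level security driven by group membership.
rows = [
    {"customer": "Acme",   "region": "east", "salary_band": "B", "revenue": 500},
    {"customer": "Zenith", "region": "west", "salary_band": "A", "revenue": 900},
]

# Hypothetical policy: which rows a group may see, and which columns are hidden.
policy = {
    "east_analysts": {"row_filter": lambda r: r["region"] == "east",
                      "hidden_columns": {"salary_band"}},
}

def apply_security(rows, group):
    rule = policy[group]
    visible = [r for r in rows if rule["row_filter"](r)]           # row-level security
    return [{k: v for k, v in r.items() if k not in rule["hidden_columns"]}
            for r in visible]                                      # column-level security

print(apply_security(rows, "east_analysts"))
# [{'customer': 'Acme', 'region': 'east', 'revenue': 500}]
```

Real tools typically evaluate rules like these server-side, against directory groups that already exist in your environment.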
Are you using coconuts? Make your quest more efficient.

In the classic film Monty Python and the Holy Grail, viewers hear King Arthur and his trusty servant Patsy approaching with a trademark “clip-clop, clip-clop” sound. When the duo emerges from the primordial mist, you see (spoiler alert) that the source of all this noise is not, as might be supposed, a horse. Rather, Patsy is banging two coconut shells together as the king trots about on his own two legs. The duo is getting from point A to point B in their quest, but not in the most efficient or effective way possible.

So many companies follow that script. Equipped with buzzword mandates like process optimization and data-driven decision making, it’s all too easy to make small adjustments that sound like you’re headed in the right direction but aren’t necessarily getting you there any faster. How do you drop the coconuts and get on the horse (metaphorically speaking)? What does it look like to use data to drive optimization in real terms?

We’ve got our eye on digital twins. Before you run away (how’s that for a deep-cut Monty Python reference?) from yet another data buzzword, it’s worth another look at this practical application of machine learning and data analytics. Digital twins are most often used to optimize physical assets and processes like manufacturing, warehousing, and logistics. Using sensors to collect data on a product, machine, or physical process, a digital twin feeds real-time data to a machine learning algorithm that tests variables and scenarios faster – ultimately leading to actionable process improvement insights. These days, we’re starting to see more businesses use digital twin frameworks to optimize and innovate non-physical business processes like accounting, HR, and marketing as well.
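In miniature, the pattern looks something like this: a toy “twin” of a fulfillment line replays recorded cycle times and tests a staffing variable before anything changes on the real floor. The readings, the staffing model, and its overhead term are all invented for illustration.

```python
import statistics

# Toy digital twin: a simulated fulfillment line fed by recorded cycle times.
sensor_readings = [12.0, 11.5, 13.2, 12.8, 11.9]   # minutes per order, from the floor

def simulate(readings, workers):
    """Crude twin of the process: more workers shorten each cycle,
    but add a coordination overhead per worker."""
    base = statistics.mean(readings)
    return base / workers + 1.5 * workers

# Test staffing scenarios against the twin instead of the live process.
best = min(range(1, 6), key=lambda w: simulate(sensor_readings, w))
print(best, round(simulate(sensor_readings, best), 2))   # 3 8.59
```

A production twin would swap the toy formula for a learned model and stream the readings in real time, but the loop is the same: replay, vary, compare, then act.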
A digital twin simulation can help you surface interdependencies and inefficiencies that might otherwise be blind spots, especially if they’re baked into your business culture as “the way we’ve always done it.” In the quest for digital transformation, don’t settle for coconuts. Instead, let’s talk about the ways your data can carry more of the weight for you.

Get smart: If all this talk of Monty Python and the Holy Grail puts you in the mood for an old-school movie night, good news: it’s available on Netflix. And if you’re looking for a more literary scratch for your Middle Ages (ish) itch, we’re reading Cathedral by Ben Hopkins. It’s a fascinating look at the complex processes involved in constructing architectural marvels in the days before edge computing. We may handle optimization differently now, but human nature stays the same.
Internet users produce an estimated 2.5 quintillion bytes of data each day. Yes, that’s quintillion — as in a one followed by 18 zeroes. That’s a mind-boggling amount of data. Yet, every day, that information is mined, analyzed, and leveraged into usable insights that businesses then use to streamline operations, assess risks, track trends, reach a specific target audience, and so much more.

Big data, the term we use to describe this vast amount of information, is a goldmine for industries seeking to increase revenue and improve operations. But without a solid strategy for how to use that data, you could scour the internet until the end of time and still not see any gains. Before you dive into the big datasphere, it’s best to familiarize yourself with what a big data strategy looks like. Then, you can take measured steps to ensure your vision is properly focused and ready to deliver the value you need.

What is a big data strategy?

A big data strategy is exactly what it sounds like: a roadmap for gathering, analyzing, and using relevant industry data. Regardless of business vertical, an ideal big data strategy will be:

- Targeted. You can’t hit a moving target, let alone one that’s too nebulous to define. Drill down to the details until stakeholders are aligned on the business objectives they want to reach through your big data strategy.
- Actionable. Data can be insightful without necessarily being actionable. If your big data strategy doesn’t serve up information usable by the broader team while paving the way for next steps, it likely won’t be beneficial in the long run.
- Measurable. As with any other business plan, a big data strategy needs to be measurable to deliver lasting success. By measuring your incremental progress, you can refine your strategy along the way to ensure you’re gathering what you need and assessing it in a way that serves your goals.

What’s the best way to approach a big data strategy?
Now that we’ve covered the basics of what a successful big data strategy entails, let’s turn to how your organization might put one into practice. As we’ve worked with clients across industries, we’ve seen the following six steps deliver wins. Your big data strategy will likely require unique details, but this action plan gives you a starting point.

1. Gather a multidisciplinary team

Big data is not solely an IT project; it’s a business initiative. The team should have more representatives from business departments than from the corporate technology group. Members typically include knowledgeable staff or managers from finance, business development, operations, manufacturing, distribution, marketing, and IT. The team members should be familiar with current reports from operational and business intelligence systems.

A common thread? Each team member brings ideas about performance indicators, trend analysis, and data elements that would be helpful to their work but which they don’t already access. More importantly, they know why having that information readily available would add value — not only for their business units, but for the organization as a whole.

2. Define the problem and the objectives

What problem should be analyzed? What do you hope to achieve through your strategy? Take three problems you’d like to have solved and formulate them into questions. Limit yourself to three, to start. There will always be more questions to answer. Don’t try to tackle them all at once. Write those questions as the subject line on three emails. Send them to all members of the multidisciplinary team. The replies will guide your efforts in narrowing (or expanding) the initial scope of study.

Here are a few questions to get the ball rolling:

- What do you want to know (about your audience, your processes, your revenue streams, etc.)?
- Which factors are most important for increasing margin on a given service or product?
- How much does social media reflect recent activity in your business?
- Which outcomes do you want to predict?

Developing a 360-degree view of all customers in an enterprise may be too ambitious for an initial project. But finding the characteristics of commercial customers who have bought products from multiple lines of business in five key geographic markets might be a more manageable scope right out of the gate. With this approach, iterations in development provide expansion to all lines of business or to all markets in cadence with a company’s business pace.

3. Identify internal data sources

Before getting into the technical weeds, you need to know what data exists internally from a functional viewpoint. Gap analysis will uncover incomplete data, and profiling will expose data quality issues. Your first step is just to identify what usable data you have. If customers for one line of business are housed in an aging CRM, and customers for a newer line of business are found in a modern system, a cross-selling opportunity analysis will point out the need to integrate those data sources.

Do you have an inventory of data sources written in business language? In forming a strategy, a team will want to have references, such as vendor contracts, customer list, prospect list, vehicle inventory, AR/AP/GL, locations, and other terms that describe the purpose or system from which the data is derived. The list can be expanded for technologists later.

Learn how to develop data as an asset >>

4. Find relevant external data sources

If you don’t have enough data internally to answer your questions, external data sources can augment what you do have. Public data sites like Data.gov, the U.S. Census Bureau, and the Bureau of Labor Statistics’ Consumer Price Index have a vast amount of information available to anyone who can operate a search function. Data.gov alone has over 100,000 datasets, some containing millions of rows covering years and decades.
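The gap analysis and profiling mentioned in step 3 can be sketched in miniature as a field-completeness report. The CRM records and field names below are invented for the example.

```python
# Minimal profiling sketch: field-completeness report over invented CRM records.
records = [
    {"customer": "Acme",   "email": "a@acme.com", "segment": None},
    {"customer": "Zenith", "email": None,         "segment": "retail"},
    {"customer": "Orbit",  "email": "o@orbit.io", "segment": "retail"},
]

def profile(rows):
    """Return the percentage of non-null values per field,
    a first pass at finding gaps and quality issues."""
    fields = rows[0].keys()
    return {f: round(100 * sum(r[f] is not None for r in rows) / len(rows), 1)
            for f in fields}

print(profile(records))   # {'customer': 100.0, 'email': 66.7, 'segment': 66.7}
```

Fields with low completeness are candidates for cleanup, or for augmentation from the external sources described in step 4.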
Social media is another invaluable source of data. Regardless of industry, Twitter, Facebook, and Pinterest posts may have a greater impact on your operation than you realize. Be sure that a couple of members of the team pursue data from social media sources to include in the initial study.

5. Develop an organizational system

One of the most important elements of a big data strategy is organizing the data you collect. Whether it’s analytics dashboards or full-blown data fabric systems, you’ll need a way to organize data in order to analyze it. Decide how and where you want the data to live, how it can be accessed, and who will have access to it. Remember that the more you democratize data, the more your team grows comfortable with reading and handling this information, and the more insight you can glean. However, this also means you’ll need a strong system of management to ensure the data is secure.

6. Get experienced guidance

Engaging an experienced team that has led others through data strategy and implementation can help you jump-start your strategy. An external resource skilled in big data management can provide your company with a smooth progression through the many tasks at hand. Your guide should have extensive knowledge of business data elements (BDEs), which are key to creating understandable, cross-company analytical outputs, including reports, charts, graphs, indicators, and other visualizations. Seek guidance especially if your organization doesn’t have a data glossary, network administration, or knowledge of new technologies, as implementing these can be highly technical and time-consuming.

Planning your big data strategy

Planning a big data strategy will require you to rethink the way you manage, operate, and analyze your business. But with the right guidance and tools, you can develop an effective strategy that positions your company for growth and success. Need a guide on the path to creating your big data strategy? We’re here to help.
Reach out to an expert to learn more about how you can leverage big data for your business.

Discover our strategic data management services >>