Achieved higher-quality data and a 97% efficiency gain in data processing by automating and optimizing high-friction workflows
Our platform team unified Infogroup’s fragmented, on-premises systems into a cloud-based data-processing platform supporting end-to-end B2B and B2C data ingestion, processing, and distribution.
I led UX design across all platform experiences, defining user-centered strategy and systemized design direction for Data Axle’s engineering and processing teams in Omaha.
Cross-functional collaboration helped co-define the evolving ideal user experience for each initiative, with goals that aligned teams, streamlined the data-processing pipeline and customer communications, and drove long-term platform innovation and customer impact. This work elevated the platform to a leader in data quality within Infogroup and delivered more approachable, scalable B2B and B2C data offerings.
“Words truly cannot express the value that the Data Axle platform delivers to the company and how vital the platform is to our strategy - the best proof is without doubt the rebranding of the entire company from Infogroup to the Data Axle”
- Mike Iaccarino - 2019 - Data Axle CEO
PROJECTS:
My role:
User research | Design | Illustration | Design socialization and cross-collaboration with Support, Data Cleansing, Product Management, Engineering, Marketing, and Branding teams.
Create intuitive internal tooling software to power data-driven decisions in a cloud-based data processing platform
* This case study is by far the longest, focusing on Merge Review while outlining the research and design approach applied across related data-processing design workstreams for Place Management, Tally Reports, Fill Rate Reports, Feedback UI, Corporation Management, and Version History UI, developed in collaboration with Data Processing, Engineering, Product Management, and Support teams.
This work transformed Data Axle’s data processing into a comprehensive cloud-based software platform, systemized development, and re-envisioned data processing software, influencing all subsequent experiences and helping deliver real-time data to B2B/B2C end customers.
Influencing three main demographics:
Product team - Design and Engineering -
Enabled user-centered research and design, standardization, systemization, and the development of a clear design pattern library that eliminated ambiguity in product management, design, and engineering workstreams. This freed engineers to focus on technical execution instead of subjective interpretation, reclaiming valuable development time, accelerating product cycles, and reducing overhead and costs.
Data processing users -
Enabled a consistent cloud-based software experience, clear calls-to-action for making data-driven decisions, and greater visibility into how their contributions impacted overall data quality for external B2B and B2C data customers.
B2B/B2C data customers -
Enabled streamlined operations, higher data accuracy, and real-time data availability, with API data distribution and ingestion.
SITUATION:
Audits revealed opportunities for better efficiency and systematization, with redundant manual processes across teams and clear paths to streamline operations and reduce costs at scale. Infogroup’s data processing (ingestion, cleansing, distribution) spanned six siloed teams, lacked end-to-end ownership, and resulted in inconsistent data quality standards and a time-intensive 6-7 week process.
The legacy data-processing environment was built from a patchwork of single-function client tools developed over decades by Omaha engineering and product teams, offering little consistency in user experience, quality controls, or reporting, and no cloud integration. Outside of a handful of early Data Axle platform automations, these were the tools Infogroup relied on for data processing.
Audit of software and tools (required for context)
Example 1 - Match UI, Append UI, Merge UI client-based software: New place submissions were reviewed and filtered through a cycle of different software used by separate teams. Submissions first landed in Match UI, prompting the user with the question “Do These Places Match?” If unresolved, the record moved to Append UI or Merge UI for further research, then circled back to New Place UI for final confirmation and database entry.
😖 This process forced users to
jump across multiple tools with similar but different interfaces
repeat research on the same data
make subjective, inconsistent decisions
delay delivery
Example 2 - Omaha AS400 database software: A number of the tools were surfaced through ASCII-based, key-code software that had reporting capabilities but came with an immensely steep learning curve, was difficult to update, and required on-premises user access to a legacy AS400 (circa 1988) in Omaha, NE. (This was actually the most robust and effective software in use.)
Example 3 - Omaha data processing workflow: For large business files, data teams partially relied on manual spot-checking of full data files, sending small subsets through data processing prior to ingestion, rather than processing the entire file, which led to scalability issues and limited quality assurance. This process relied on using Excel spreadsheet data files (no screenshot included).
Example 4 - Data Axle data processing software: The Data Axle platform team's Review UI consolidated many of the functions of the above tools (effectively duplicating them), was the only cloud-based solution out of the bunch, and included history and reporting capabilities, but suffered from major usability issues. This was the best candidate to use in creating a more comprehensive, scalable solution.
Breaking down situational challenges per demographic:
Engineering challenges -
No consistent approach to user research; some engineers conducted none at all and built requested data-processing features in silos, on intuition.
Lack of visibility and consistency in CSS, components, and design patterns.
Fragmented experiences and duplicated engineering efforts.
Data processing team challenges -
Six separate cleansing teams used different tools and workflows, with no unified process or end-to-end reporting.
Each tool had its own UX, requiring separate learning curves.
Data quality was measured inconsistently across teams.
No visibility into full processing history or timelines.
Ingestion and distribution methods varied by team.
B2B/B2C Customer / End-user challenges -
Monthly loads delivered outdated data files.
Quality was inconsistent due to the ad-hoc spot-checking during large file ingestion and the fragmented, team-specific measurement methods.
Data delivery was inconsistent—via express mail, email attachments, or cloud download.
PROCESS:
A substantial amount of iterative research over time was required to fully understand the number of tools and processes the Data Processing teams used to do their jobs. Research was conducted cross-functionally by Design (myself), platform engineers, and product management team members, then socialized through cross-functional presentations and collaborative workshops to the broader team. These audit and iterative research initiatives enabled co-discovery and co-definition of the ideal UX statements and goals (evolving over time) that guided our journey.
Ideal UX statements - per demographic:
Design and Engineering Teams -
“Design and engineering teams can streamline workflows and reduce overhead through a scalable, component-based framework that minimizes duplication and accelerates development.”
Data Processing Users -
“Data-processing teams can manage and execute workflows to ingest, cleanse, and distribute data more efficiently through a unified cloud-based pipeline, eliminating duplication and reliance on fragmented tools.”
B2B/B2C Customers / End-users -
“Customers consistently receive high-quality data at scale, from a single reliable platform free from delays or inconsistencies.”
Goals - per demographic:
Design and Engineering Teams -
Reduce individual engineer siloed UX research efforts.
Establish a consistent, trusted practice for UX-led user research for various sizes of projects.
Reduce duplicated code or design patterns.
Increase adoption rate of a shared style guide and component system across engineering projects vs individual silos.
Define and adopt a common style guide, component system, pattern library, and interaction models.
Data Processing Users -
Increase data accuracy, fill rate, and quality measurement consistency across data processing pipelines.
Reduce the number of different tools used across teams.
Increase consistency of workflow patterns across data processing experiences.
Reduce redundancies in manual processes.
Combining similar processes
Automation
Reduce duplication or siloed team documentation sources.
B2B/B2C Customers / End-users -
Reduce the share of customers ingesting monthly loads (currently 100%).
Enable consistency in distribution
Enable scalability
Enable ingestion and distribution through APIs to support continuous data feeds
👉🏼
* The following explanation focuses on the iterative redesign of Merge Review. Similar processes unfolded for all other Data Axle platform software design initiatives.
Merge Review redesign V1:
While on-site in Omaha, I conducted ethnographic research using Data Axle’s processing software for Merge Review and identified several UX friction points. With only a few days on-site, I prioritized one high-impact issue: researching submitted values forced users to constantly switch between the UI and external browsers, relying on inconsistent methods like quick keys, bookmarks, or manual queries.
I targeted this as an early win, streamlining the process to reduce context switching and ease cognitive load.
I designed a “research ribbon” that included buttons for the five most valuable research methods, sketched below. We already had this data in our system, making it easy to repurpose and optimize.
Securities and Exchange Commission (SEC) link
A Google search on the place name
A Google map search on the place name
Facebook link
Any associated anchors
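To illustrate how a ribbon like this could be composed from values already on a record, here is a minimal TypeScript sketch; the record shape, field names, and link targets are assumptions for illustration, not the production implementation.

```typescript
// Hypothetical record shape; field names are illustrative only.
interface PlaceRecord {
  placeName: string;
  city: string;
  state: string;
  facebookUrl?: string;  // social link already captured on the record, if any
  anchorIds: string[];   // associated anchor record ids
}

// Build the five research-ribbon links from data already in the system.
function buildResearchLinks(record: PlaceRecord) {
  const query = encodeURIComponent(`${record.placeName} ${record.city} ${record.state}`);
  return {
    // SEC EDGAR company search on the place name
    sec: `https://www.sec.gov/cgi-bin/browse-edgar?action=getcompany&company=${encodeURIComponent(record.placeName)}`,
    // Google web and Maps searches on the place name and location
    googleSearch: `https://www.google.com/search?q=${query}`,
    googleMaps: `https://www.google.com/maps/search/?api=1&query=${query}`,
    // Captured Facebook link if present, otherwise a Facebook search
    facebook: record.facebookUrl ?? `https://www.facebook.com/search/top?q=${query}`,
    // In-app links to any associated anchor records (hypothetical route)
    anchors: record.anchorIds.map((id) => `/places/${id}`),
  };
}
```

Presumably each button simply opened its link alongside the record, which is what removed the need to leave the UI and re-key searches by hand.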
A couple of platform engineers had joined me on the trip and were able to quickly add the research ribbon, so I could run validation tests with users the next day.
Quick validation testing
The new research ribbon was a hit!
Users loved the new inline research functionality and were excited to see their direct contributions from the day before reflected so quickly. This engagement encouraged users to open up and share additional valuable usability insights, which directly shaped the next, more robust round of design improvements.
Standardize common usability patterns to improve page flow and contextual understanding and reduce engineering overhead
Redesign layout to support a natural top-left–to-bottom-right reading flow.
Add clear contextual identifiers — app title (Merge Review) and subtitle — to orient users within the workflow.
Co-locate essential record details (place name, address, Infogroup ID, timestamp, data source) for quick recognition and informed decisions.
Integrate a record-note field so users can capture research context, automatically logged in record history.
Increase efficiency and user control
Consolidate Submit and user action controls within a unified bottom-right action panel aligned to user reading and interaction patterns.
Add Skip/Reject functionality to let users bypass indeterminate records without impacting performance metrics.
Enhance calls-to-action and visual feedback
Refine the research ribbon with branded icons and color accents to improve discoverability and reinforce visual hierarchy.
Apply a dynamic blue highlight to the Submit button after edits, signaling readiness for submission and reinforcing system feedback.
Merge Review redesign V2:
Following the platform team’s rebrand and a handful of other design initiatives (in separate work streams), I led a Merge Review redesign about a year later, advancing both functionality and the evolution of our platform’s software design language. There had been a handful of minor updates over the last year, visible in the comp below, including the initial stages of our Data Axle platform UI redesign and rebrand.
I began by conducting and synthesizing baseline research studies with senior stakeholders, data-cleansing users, support staff, and engineering teams to ensure holistic alignment.
Initial research insights:
Small click targets: Radio button choices were only 16x16 pixels, making fast selection difficult.
Cluttered UI: Users frequently complained about crowded screens and information overload.
Skip function abuse: The added “skip” feature returned records to the team queue, enabling users to pass difficult reviews to teammates 😖.
Context tracking: When working on large business records, users lost track of which fields had been updated and which still needed review.
Notes: While record-level notes existed, users wanted field-level notes for more contextual feedback.
Sensitive data: Data processing required a way to suppress sensitive business and consumer data and remove it from deliveries.
From these insights, I created design epics and associated tasks, and collaborated with engineering and product management to drive design and research iteration and cross-functional visibility and alignment.
Design process:
Conducted remote and on-site user research using these prototypes, with data-processing users in Omaha, Nebraska.
Defined design/engineering pipeline standardizations for common CSS, new components, and the creation of a Bootstrap pattern library to unify our platform design language.
Socialized research insights and led design lunch-and-learns and workshops
Standardized all documentation and created one robust, cloud-based source of truth through partnering with engineering, product management, and data-processing stakeholders.
Enabled larger hit targets
Interactions with radio buttons, verifications, and data entry became less cumbersome.
The new Merge Review was the first to benefit from our color callouts
Call-to-action blue highlights - CTA Blue highlighted important data within workflows that required user attention. In the Merge Review UI, CTA Blue was used to highlight any differences between the current record values and the newly submitted values.
Green change highlights - All changed values were highlighted in green, including verifications, which received a heavy green square with a white checkmark.
Included the ability to add a field-level note - Users could now provide context on an entire record or call attention to a specific field value, adding clarity to the record history for any changes made.
Enabled users to see and track record edits prior to submission with a changelist - When changes were made, the action pane displayed a clear message, “You have made changes to this record,” and listed the edits in green, co-located with the Reject, Skip, Hold, and Submit action buttons. This allowed users to preview all modifications before finalizing an action, reducing errors and improving confidence (a minimal sketch of this changelist logic follows this list).
Reduced the misuse of the Skip function - Added a “More” menu to the action pane, enabling users to select a “Hold” option and return to the record later.
Hold returned the record to the same user’s queue after a set time.
Skip sent the record back into the team queue.
Both actions were recorded, ensuring visibility in how records were handled.
Enabled the ability to suppress fields - Sensitive customer data could now be manually suppressed from records and removed from all future data deliveries.
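As a rough illustration of the changelist behavior described above, here is a minimal TypeScript sketch, assuming a flat record of field values; the types and function name are hypothetical, not the actual implementation.

```typescript
type FieldValues = Record<string, string>;

interface PendingChange {
  field: string;
  before: string;
  after: string;
}

// Compare the record as loaded with the user's edited values and return
// the list shown in the action pane ("You have made changes to this record").
function buildChangelist(original: FieldValues, edited: FieldValues): PendingChange[] {
  return Object.keys(edited)
    .filter((field) => edited[field] !== original[field])
    .map((field) => ({
      field,
      before: original[field] ?? "",
      after: edited[field],
    }));
}

// Example: two edits produce two green changelist entries before Submit.
const changes = buildChangelist(
  { name: "Acme Co", phone: "402-555-0100", verified: "false" },
  { name: "Acme Company", phone: "402-555-0100", verified: "true" }
);
// changes -> [{ field: "name", ... }, { field: "verified", ... }]
```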
I designed scalable, engaging experiences grounded in user-centered research - Built scalable component patterns, system documentation, and high-fidelity prototypes, enabling cross-functional participation, visibility, and rapid iteration through collaborative research and design workshops with Engineering, Product Management, and Data Processing teams.
Fully functional prototypes
Many modals
RESULTS:
This Merge Review workstream reflects a recurring theme of my UX work at Data Axle: consistent, user-centered impact that unified fragmented on-prem client software into a streamlined, cloud-based platform, accelerating engineering, empowering data-processing teams, and delivering higher quality data to B2B and B2C customers.
These internal tooling optimizations and engineering automations achieved an overall 97% efficiency gain, streamlining a 6-7 week process into a 24-hour data processing turnaround.
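As a back-of-the-envelope check on that figure (assuming the 6-7 weeks refers to end-to-end elapsed time), the reduction works out to roughly 97-98%:

```typescript
// 6-7 weeks of elapsed processing time, using the 6.5-week midpoint.
const legacyHours = 6.5 * 7 * 24; // ~1,092 hours
const newHours = 24;              // single-day turnaround
const reduction = 1 - newHours / legacyHours;
console.log(`${(reduction * 100).toFixed(1)}% faster`); // ~97.8%
```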
The redesigned Data Axle cloud-based platform's impact, per demographic:
Design and Engineering Teams -
Unified design and development standards
Adopted a UX research-led process and a shared design system (Bootstrap-based CSS specification) that standardized components, patterns, and usability principles across all projects.
Improved operational efficiency
Reduced design and development overhead through reusable assets and consistent workflows, accelerating product cycles and improving maintainability.
Increased visibility and collaboration
Established clearer cross-team visibility into design direction and implementation, strengthening communication between design and engineering.
Data Processing Users -
Streamlined, unified data-processing platform
Centralized all data ingestion, cleansing, and distribution workflows into a single cloud-based portal, eliminating fragmented tools and siloed processes.
Standardized definitions and quality measures
Delivered consistent standards for data accuracy, fill rate, and quality metrics, enabling teams to assess performance and downstream impact in real time.
Simplified, consistent user experience
Introduced standardized taxonomy, contextual titles, and stronger calls-to-action to create a coherent, end-to-end workflow experience across all tools.
Consolidated documentation and resources
Merged all data-processing documentation into one authoritative cloud-based repository, reducing confusion and improving self-service access.
B2B/B2C Customers / End-users -
Real-time, scalable data delivery
Replaced static monthly data loads with real-time asynchronous feeds, enabling continuous access to the most current data.
Consistent and reliable distribution with high performance
Established standardized API delivery through AWS, ensuring dependable, scalable, on-demand data access across all customer touchpoints, without compromising speed or quality.
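To give a sense of what that on-demand access can look like from the customer side, here is a minimal TypeScript sketch of pulling records updated since a last sync from a hypothetical feed endpoint; the URL, parameters, and response shape are assumptions for illustration, not the actual Data Axle API.

```typescript
// Hypothetical endpoint and response shape; illustrative only.
interface FeedResponse {
  records: Array<{ id: string; updatedAt: string; fields: Record<string, string> }>;
  nextCursor?: string;
}

// Pull every record updated since the last sync, following pagination cursors,
// instead of waiting for a monthly file drop.
async function pullUpdatesSince(sinceIso: string, apiKey: string): Promise<FeedResponse["records"]> {
  const all: FeedResponse["records"] = [];
  let cursor: string | undefined;
  do {
    const url = new URL("https://api.example.com/v1/places/feed"); // placeholder host
    url.searchParams.set("updated_since", sinceIso);
    if (cursor) url.searchParams.set("cursor", cursor);
    const res = await fetch(url, { headers: { Authorization: `Bearer ${apiKey}` } });
    if (!res.ok) throw new Error(`Feed request failed: ${res.status}`);
    const page: FeedResponse = await res.json();
    all.push(...page.records);
    cursor = page.nextCursor;
  } while (cursor);
  return all;
}
```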
Design and champion differentiating branding for Data Axle platform
SITUATION:
Infogroup’s B2B and B2C customers relied on fragmented data teams delivering inconsistent quality and documentation. As the provider of Infogroup’s most advanced data infrastructure, the Data Axle Platform Team needed a distinct visual identity to stand out internally and communicate a modern, real-time data narrative to both internal stakeholders and external customers.
This is the header of the platform's cloud-based UI prior to the rebranding initiative, displaying the previously used Data Axle logo and Infogroup lockup.
PROCESS:
I began this Data Axle branding workstream with a fact-finding mission 🤓
Data-processing team members’ early interviews revealed misconceptions about Data Axle’s role and value. Many saw the platform as simply overlapping with existing workflows. As discussions unfolded, teams began to recognize Data Axle’s potential to unify processes, improve visibility, and deliver consistent quality across data ingestion, cleansing, and distribution.
Team leader conversations reflected a growing optimism. While initial skepticism varied by group, most leaders envisioned how adopting Data Axle could streamline operations and establish a stronger foundation for data accuracy and collaboration across teams.
B2B / B2C customer interviews were the most encouraging. Customers praised Data Axle's near real-time access to critical data sets, transparency into data quality metrics, and comprehensive documentation that detailed packages, lookup tables, field translations, and API availability, all key differentiators in the market. The only negative feedback involved differentiation from, and reducing dependence on, multiple Infogroup data sources.
Interviewed the platform CTO to understand the intended platform positioning.
Audited Infogroup products’ branding and customer positioning.
I shared these insights cross-functionally and defined the ideal UX statements of this branding initiative with clear goals that guided subsequent iterations on the platform experience.
Ideal UX statement:
“Internal teams and external customers clearly understand and trust Data Axle’s leadership in data quality, with branding that communicates real-time ingestion, cleansing, availability, and distribution, focusing on search, submission, and subscription capabilities.”
Data Axle rebranding goals:
Reduction of confusion in customer conversations
Support conversations
Sales conversations
Conference conversations
Customer research conversations
Process continued…
Conducted brand and industry audit
Analyzed B2B and B2C data industry branding, visual systems, and customer positioning to identify differentiation opportunities for Data Axle.
Combined external audits with internal stakeholder interviews to define the platform’s unique value narrative.
Designed a modern, scalable identity system
Created a standalone logo symbol for use in compact applications such as favicons and icons.
Designed a combination mark pairing the symbol and wordmark to reinforce the platform’s full identity across contexts.
Validated design direction through collaboration
Presented concepts to cross-functional leaders and gathered iterative feedback from the platform, branding, and marketing teams.
Reviewed with the CEO to refine messaging and ensure alignment with the broader organizational vision.
Refined and finalized the identity through iteration
Incorporated cross-team and leadership feedback to iteratively strengthen the logo system, ensuring it captured the platform’s technological rigor and reliability.
Implemented the identity system across all brand touchpoints
Data Axle platform cloud-based UIs and documentation
Company newsletters, presentations, and marketing materials
Infogroup corporate web properties and assets
Business cards, stationery, stickers and environmental graphics, including an office mural celebrating the new identity.
RESULTS:
Created a modern branded solution that differentiated the platform team's solutions from other fragmented Infogroup teams, aligned with industry branding trends, and drew inspiration from the core real-time platform technology of data ingestion, cleansing, availability, and distribution, while highlighting search, submission, and subscription capabilities.
Implemented the identity system across all brand touchpoints
Data Axle platform cloud-based UIs and documentation
Company newsletters, presentations, and marketing materials
Infogroup corporate cloud-based properties and assets
Business cards, stationery, stickers and environmental graphics, including signage and an office mural celebrating the new identity.
The new Data Axle identity, paired with the success of the cloud-based Data Axle platform, its data-processing software, APIs, and real-time technologies, established a trusted, unified symbol of data quality across internal teams and customer experiences. This powerful combination of technical innovation and cohesive design elevated the platform’s reputation to such an extent that Infogroup ultimately rebranded the entire company around the Data Axle name, design, and platform principles.
Design and illustrate a visual identity system to showcase Data Axle’s capability-driven value and impact
SITUATION:
Customer research and interactions with prospective customers exposed significant communication gaps. Data Axle’s complex technologies, data models, and processes were challenging to explain to non-technical stakeholders, presenting an opportunity to create a visual narrative that bridged these conversations.
PROCESS:
I conducted UX research and design iterations with broad socialization and cross-functional participation to help bring the teams along with me in this illustration adventure.
Research Insights -
All user bases would welcome the additional visual layer of definition.
The largest concern across teams and my primary design challenge was to ensure that any added images provided clarity and deepened user comprehension without distracting or oversimplifying the intent of the documentation.
Taking these insights into a workshop and UX office hours, I socialized concepts cross-functionally to confirm that any additional visual voice would support and not distract from technical concepts for both internal and external users. These conversations helped frame and define the ideal UX statement and drive key initiatives.
Ideal UX statement -
“Enable a more effective learning path and better conceptual alignment of the capability-driven value for less-technical users - while promoting trust, quality, and technical accuracy for all users.”
Key initiatives:
Utilized and extended the messaging, intention, and visual style of previous visuals I had created for the Data Axle platform logo (data ingestion, data cleansing, and real-time data availability).
Conducted workshops and feedback loops to socialize and gain alignment across product, engineering, and data-processing teams.
Tested a number of concepts with users to ensure a non-biased balance between visual language and technical content was achieved.
Continued iterative illustration and long-term support for the vision.
RESULTS:
Connected real-world use cases with platform data capabilities by creating a visual identity system for illustrations, infographics, and data visualizations that supported technical conversations and extended capability-based visual clarity.
Extended the visual narrative to support Business data, Consumer data, and B2C database content.
Incorporated the playful, informative visual narrative within our API artwork, driving comprehension.
Managing this visual language was a continual parallel effort, as the narrative of Data Axle data was in constant evolution.
Database logos
Places = Business data - 20M records | 200M contacts | 469 attributes
People = Consumer data - 300M records | 500M emails | 311 attributes
B2C = 116M B2C links | 88M business emails | 415 attributes
API artwork
Created a consistent card structure that could be replicated across our API family - illustrations represented API core functionality, while alluding to database logos and Data Axle identity and search, submit, and subscribe tenets.







Platform artwork
Created illustrations, infographics, and data visualizations to supplement conference media, presentations, monthly newsletters, and email blasts, providing a more colorful and approachable understanding of the concepts and technologies available through Data Axle.
Data Axle Conference animation
Narrated by our CTO at the conference (narration not available).
The Data Axle visual identity system evolved into a vibrant, capability-driven design language that clarified complex technology across products, presentations, and stakeholder interactions. Through simplified visual narratives and UX storytelling artifacts, I helped cross-functional and non-technical customer stakeholders quickly understand impact, align on priorities, and gain confidence in Data Axle’s capabilities and long-term vision.
PROJECT 4
Design and drive the adoption of comprehensive cloud-based documentation storytelling that engages and empowers all users
SITUATION:
Documentation was inconsistent, fragmented across teams, and often overly technical, lacking version control and a central source of truth. As a result, quality and accessibility standards varied, and less technical internal and external customer product leaders struggled to understand Data Axle's data, services, and API capabilities. This forced both sets of documentation users to rely heavily on support and engineering teams, creating significant team debt and slowing platform evolution and adoption.
PROCESS:
I audited existing documentation across teams and partnered with Product Management and Engineering to identify communication gaps, highlight and socialize opportunities for improvement, gain broad alignment, and define the ideal UX statement.
Ideal UX statement:
“Deliver effective cloud-based documentation for both users and internal teams from a single, accessible, and consistent source of truth, simplifying the complexity of platform technology and clearly communicating its capability-driven value.”
Goals -
-
Consolidate fragmented resources into a single documentation system for Business, Consumer, and B2C data, improving clarity, consistency, and access.
Create a portal with clear categorization and navigation across datasets and API documentation.
-
Data
API documentation
Data Packages
Lookup tables
Field Translations
-
Provide a helpful visual narrative for Data Axle data when useful to bridge comprehension.
Continually expand and validate.
-
Identify common patterns, communication gaps, and requirements.
Process continued…
This was an iterative process of gathering and integrating all data into one comprehensive cloud-based portal.
RESULTS:
I designed and refined a cloud-based documentation portal that centralized all data resources into a single, navigable source of truth. Leveraging Data Axle’s visual identity, the system translated complex capabilities into clear, accessible value for both technical and non-technical users.
The new portal provided self-service support and comprehensive guidance for internal teams and customers alike, fostering a shared understanding and empowering more effective cross-functional collaboration.
“Data Axle’s online documentation is our new knowledge center for Infogroup’s core databases.”
- Mike Iaccarino - Data Axle CEO
API documentation example:
Data Guide example:
Additional illustrations









Other important documentation pages:
Data Dictionary - Definitions of all the data fields available in Data Axle.
Lookup Tables - Efficient repositories for predefined values and mappings.
Packages - Predefined data segments.
Field Translator - Translations between legacy system fields and Data Axle fields (a small sketch of these mappings follows this list).
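To make the lookup-table and field-translation pages concrete, here is a small TypeScript sketch of the kind of mappings they document; the codes and field names below are made up for illustration.

```typescript
// Lookup table: predefined codes mapped to human-readable values (illustrative codes).
const employeeSizeLookup: Record<string, string> = {
  A: "1-4 employees",
  B: "5-9 employees",
  C: "10-19 employees",
};

// Field translator: legacy field names mapped to Data Axle field names (hypothetical names).
const fieldTranslations: Record<string, string> = {
  CO_NAME: "placeName",
  EMP_SIZ: "employeeSizeCode",
  SIC_CD: "primarySicCode",
};

// Translate a legacy record into Data Axle field names and expand coded values.
function translateLegacyRecord(legacy: Record<string, string>): Record<string, string> {
  const translated: Record<string, string> = {};
  for (const [legacyField, value] of Object.entries(legacy)) {
    const axleField = fieldTranslations[legacyField] ?? legacyField;
    translated[axleField] =
      axleField === "employeeSizeCode" ? employeeSizeLookup[value] ?? value : value;
  }
  return translated;
}
```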
Overall impact
I led a UX-driven overhaul of Infogroup’s data-processing pipeline, software and platform team’s visual language, and comprehensive data documentation, helping unite design and engineering to transform Infogroup’s culture, accelerate cross-team efficiencies, and elevate customer productivity! 🤓
The Data Axle platform became synonymous with data quality and trust, both internally and externally for our B2B customers. These initiatives were transformational for the platform team and Infogroup as a whole, heavily influencing company-wide strategic long-term vision and ultimately driving the rebranding of the entire company to the Data Axle name.
P.S. Here’s a little positive press from my time at Infogroup/Data Axle…
Infogroup named a leader in the Forrester Wave - B2B Marketing Data Providers 2018 report
Key findings regarding Infogroup in the report:
Infogroup received the highest possible score in several criteria, such as data management, integrations and APIs, go-to-market strategy, product roadmap, revenue, and customer base.
The report highlighted Infogroup's capability to identify and integrate personal and professional insights on executive buyers for improved targeting and personalization.
Infogroup's advanced data management was noted for enabling customers to integrate data sources and create detailed profiles.