Achieved a 97% efficiency gain in data processing by leading design outcomes that automated and optimized high-friction workflows
Our platform team transformed legacy Infogroup solutions from an on-prem database with multiple siloed software solutions into a unified, trusted, cloud-based data processing platform, enabling end-to-end B2B and B2C data ingestion, processing, and distribution, supported by comprehensive documentation.
I led UX design across all experiences for the Data Axle platform team in Seattle, driving UX strategy and design for both platform innovation and cloud-based tools used by data-processing teams in Omaha.
To establish a foundation, I conducted baseline interviews with Engineering, Data Processing, Sales, and Support teams, uncovering challenges across the organization. While an immediate design direction wasn’t clear, additional audits and research enabled me to define our UX outcome with success metrics that evolved over time but remained a consistent driving force behind long-term platform innovation and success.
UX OUTCOME:
“Position the Data Axle platform as Infogroup’s leader in data quality, unifying all client software into a single cloud-based platform, empowering data processing users, and enabling engineering through systemization.”
SUCCESS METRICS:
“Words truly cannot express the value that the Data Axle platform delivers to the company and how vital the platform is to our strategy - the best proof is without doubt the rebranding of the entire company from Infogroup to the Data Axle”
- Mike Iaccarino - 2019 - Data Axle CEO
My role:
User research | Design | Illustration | Design socialization and cross-team collaboration with Support, Data Cleansing, Product Management, Engineering, Marketing, and Branding teams.
< Back to top
SUCCESS METRIC 1
Create intuitive internal tooling software to power data-driven decisions in a cloud-based data processing platform
This case describes the research and design process for many data processing workstreams I worked on while with Data Axle, driving major UX improvements, workflow efficiency, and foundational component innovation that influenced all subsequent experiences. Ultimately, this work transformed Data Axle’s data processing into a comprehensive cloud-based software platform.
This study reflects processes applied consistently across all workstreams, while highlighting specifics from the Merge Review initiative. Similar workflows were followed in Place Management, Tally Reports, Fill Rate Reports, Feedback UI, Corporation Management, and Version History UI.
Each initiative involved user research and the design of components and workflow patterns that improved the UX for data processing users and the Platform Engineering team, and ultimately B2B/B2C data delivery for customers.
Data processing users benefited from a consistent cloud-based software experience, clear calls-to-action for making data-driven decisions, and greater visibility into how their contributions impacted data quality for external B2B and B2C data customers.
Platform Engineering benefited from standardized CSS, a systemized component library, and clear design patterns. My research eliminated ambiguity in engineering workstreams, freeing engineers to focus on technical execution over subjective interpretation, reclaiming valuable development time, accelerating product cycles, and reducing development overhead and cost.
B2B/B2C data customers benefited from streamlined operations, higher data accuracy, and real-time data delivered through API ingestion and distribution.
SITUATION:
Research revealed major opportunities for efficiency and systematization, highlighting duplicative manual processes across teams and clear paths to streamline operations and reduce costs at scale. At the time, Infogroup’s data processing (ingestion, cleansing, distribution) spanned six siloed teams, lacked end-to-end ownership, and resulted in inconsistent data quality standards and a time-intensive six-to-seven-week process.
*To illustrate the extent of the challenge*
The legacy data processing environment was built from a patchwork of single-function client tools developed over the years by Omaha engineering teams. Each tool required manual, human-in-the-loop intervention at every step, and there was little consistency in user experience, quality controls, fidelity, reporting, or cloud integration. Beyond a handful of early Data Axle platform automations, these were the tools the business relied on to remain competitive when I began my Infogroup journey.
Example 1 - Omaha data processing software: When new place submissions were reviewed, they were filtered through a cycle of different software used by separate teams. First landing in New Place UI, if flagged for potential duplicates, they were routed to Match UI, prompting the user with “Do These Places Match?” If unresolved, the record moved to Append UI or Merge UI for further research, then circled back to New Place UI for final confirmation and database entry.
😖 This process forced users to:
jump across multiple tools with different interfaces
repeat research on the same data
delay delivery
With a human in the loop at every step, the process became slow, inconsistent, and prone to subjective errors.
Example 2 - Omaha data processing software: A number of the tools were surfaced through ASCII-based, key-code software that had reporting capabilities but imposed a steep learning curve, was difficult to update, and required on-premises user access to a legacy AS400 in Omaha, NE. (This was actually the most effective software in use.)
Example 3 - Omaha data processing workflow: For large business files, data teams partially relied on manual spot-checking, sending small subsets through data processing prior to ingestion, rather than processing the entire file, leading to scalability issues and limited quality assurance. (no screenshot).
Example 4 - Data Axle data processing software: The Data Axle platform team's Review UI consolidated many of the functions of the above tools, was the only cloud-based solution out of the bunch, and included history and reporting capabilities, but suffered from major usability issues. This was the best candidate to use as my starting point for designing a more comprehensive, scalable solution.
Breaking down situational challenges into user groups:
Data processing team challenges:
Six separate cleansing teams used different tools and workflows, with no unified process or end-to-end reporting.
Each tool had its own UX, requiring separate learning curves.
Data quality was measured inconsistently across teams.
No visibility into full processing history or timelines.
Ingestion and distribution methods varied by team.
Engineering challenges:
No consistent approach to user research; some engineers conducted none at all, building requested data processing features in silos, on intuition.
Lack of visibility and consistency in CSS, components, and design patterns.
Fragmented experiences and duplicated engineering efforts.
B2B/B2C end user challenges:
Monthly loads delivered outdated data files.
Quality was inconsistent due to ad-hoc spot-checking during large file ingestion and the fragmented, team-specific measurement methods.
Data delivery was inconsistent—via express mail, email attachments, or cloud download.
* Thanks for reading my “brief” explanation of the challenges I faced. Now on to the good stuff!
PROGRESS METRICS:
“Unify fragmented on-prem workflows and client tools into a single, cloud-based data processing pipeline, covering ingestion, cleansing, and distribution. This pipeline would streamline data-processing team workflows, leverage a component-based framework to improve engineering efficiency, and ensure consistent, high-quality data at scale.”
PER DEMOGRAPHIC
Data processing:
Standardize definitions of accuracy, fill rate, and quality.
Design a consistent, user-centered UX for all data processing workflows.
Define a unified Data Axle pipeline that streamlines legacy workflows, automates steps, and removes redundant manual processes.
Support every cloud-based workflow with clear, accessible documentation.
Engineering:
Document all design direction to provide clarity and consistency.
Establish a consistent, trusted practice for UX-led user research.
Define and adopt a common style guide, component system, pattern library, and interaction models.
B2B/B2C end user:
Replace static monthly loads with a scalable, real-time solution.
Enable ingestion and distribution through APIs to support continuous data feeds.
PROCESS:
* The following process explanation focuses on one workstream, Merge Review, but a similar process applied to the other Data Axle platform design initiatives.
While on-site in Omaha, I conducted ethnographic research using Data Axle’s processing software and identified several UX friction points. With only a few days on-site, I prioritized one high-impact issue: researching submitted values forced users to constantly switch between the UI and external browsers, relying on inconsistent methods like quick keys, bookmarks, or manual queries.
I targeted this as an early win, streamlining the process to reduce context switching and ease cognitive load.
I designed a “research ribbon”, including the five most-used research links as clickable buttons:
Securities and Exchange Commission (SEC) link
A Google search on the place name
A Google map search on the place name
Facebook link
Any associated anchors
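The ribbon’s behavior amounts to generating a fixed set of research URLs from the place under review. A minimal sketch of the idea (function names and URL patterns are illustrative assumptions, not the production code):

```python
from urllib.parse import quote_plus

# Hypothetical sketch of the research-ribbon link builder.
# The keys mirror the five ribbon buttons; URL formats are assumptions.
def research_links(place_name: str) -> dict:
    q = quote_plus(place_name)
    return {
        "sec": f"https://www.sec.gov/cgi-bin/browse-edgar?company={q}&action=getcompany",
        "google": f"https://www.google.com/search?q={q}",
        "maps": f"https://www.google.com/maps/search/{q}",
        "facebook": f"https://www.facebook.com/search/top?q={q}",
        # associated anchors would be appended from the record itself
    }
```

Each button simply opened its link in a new tab, so users never had to leave the record to research it.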
A couple of platform engineers had joined me on the trip and were able to quickly add the research ribbon, so I could run a validation test with users the next day.
Users loved the research ribbon!
During the validation test I was able to focus on other important usability issues and gather additional insights to take back to Seattle with me, informing the following design decisions.
Additional design insights included more in-depth revisions:
Utilizing branded icons for the research ribbon buttons
With the growing presence of Data Axle software, both users and engineers would benefit from more consistency of both content presentation and standardization on common usability patterns.
Users needed a better top-left to bottom-right page read
Added a contextual app title, naming this application “Merge Review”, and a subtitle that provided context for the data being reviewed
Moved Submit from the top right and designed a bottom-right action panel, following users’ page read through the current and submitted values
Standardized place/business address and important identifying data - co-located related content (place name, address, Infogroup ID (IGID), and a timestamp) to provide context on where the suggested values originated
Added a record note - users could provide any additional context they found during their research
Users’ performance was rated by the number of tickets they reviewed during their workday, so getting stuck on a record hurt their rating. I designed the ability to skip or reject records when users couldn’t make a determination, co-locating the Skip option in the bottom-right action panel
All current software missed opportunities for effective calls-to-action… I didn’t quite crack this egg yet, but the research ribbon’s color pop and the blue highlight on the Submit button after a change was made were a good start. 🤓
I designed these revisions, included them in our QA environment, and conducted a remote validation test with users before implementing the revisions in production.
Following the platform team’s rebrand and a handful of other design initiatives (in separate work streams), I led a Merge Review redesign about a year later, advancing both functionality and the evolution of our platform’s software design language.
Conducted and synthesized iterative research studies with senior stakeholders, data-cleansing users, support staff, and engineering teams to ensure holistic alignment.
Combined remote and on-site research with data-processing users in Omaha, Nebraska.
Defined common CSS standards, new components, and a pattern library to unify design.
Designed the end-to-end software experience.
Created high-fidelity Axure prototypes for user testing and broad internal socialization.
Socialized research insights and led design brownbags and workshops.
Created robust online software and pipeline documentation.
Research insights:
Small click targets: Radio button choices were only 16x16 pixels, making fast selection difficult.
Cluttered UI: Users frequently complained about crowded screens and information overload.
Skip function abuse: The added “skip” feature returned records to the team queue, enabling users to pass difficult reviews to teammates 😖.
Context tracking: When working on large business records, users lost track of which fields had been updated and which still needed review.
Notes: While record-level notes existed, users wanted field-level notes for more contextual feedback.
Sensitive data: Data processing required a way to suppress sensitive business and consumer data and remove it from deliveries.
Revision solutions:
Designed an uncluttered UI with larger click targets - making radio button selections easy to locate and click, and highlighting important calls-to-action.
To reduce misuse of the Skip function, I designed an alternate personal Hold queue. This allowed users to delay work on a record until later, while still maintaining accountability.
The action pane was repurposed into a “More” menu, consolidating both Skip and Hold options.
Hold returned the record to the same user’s queue after a set time.
Skip sent the record back into the team queue.
Both actions were recorded for visibility, ensuring transparency in how records were handled.
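The routing difference between Skip and Hold can be sketched as two queue operations plus a shared audit trail (all names here are hypothetical, for illustration only):

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Hypothetical sketch of the Skip/Hold routing described above.
@dataclass
class Queues:
    team: List[str] = field(default_factory=list)               # shared team queue
    personal: List[Tuple[str, int]] = field(default_factory=list)  # (record, release_time)
    audit: List[Tuple[str, str, str]] = field(default_factory=list)  # (user, action, record)

def skip(q: Queues, user: str, record: str) -> None:
    """Skip returns the record to the shared team queue."""
    q.team.append(record)
    q.audit.append((user, "skip", record))

def hold(q: Queues, user: str, record: str, release_time: int) -> None:
    """Hold keeps the record in the same user's queue until release_time."""
    q.personal.append((record, release_time))
    q.audit.append((user, "hold", record))
```

Because both paths write to the audit list, team leads could see how every record was handled, which is what kept the Hold option accountable.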
I designed a Changelist feature to help users track edits within a record. When changes were made, the system displayed a clear message: “You have made changes to this record.”
A list of edits was co-located in the bottom-right action pane, alongside Reject, Skip, Hold, and Submit options. This allowed users to preview all modifications before finalizing an action, reducing errors, improving confidence, and ensuring transparency in decision-making.
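Conceptually, the Changelist is a diff between the record as loaded and the record as edited; the class below is a minimal sketch of that idea (names are illustrative, not the shipped implementation):

```python
# Hypothetical sketch of the Changelist feature: track per-field edits
# so users can preview them before Submit.
class Changelist:
    def __init__(self, record: dict):
        self.original = dict(record)   # snapshot at load time
        self.current = dict(record)    # working copy the user edits

    def edit(self, field: str, value) -> None:
        self.current[field] = value

    def changes(self) -> dict:
        # Fields whose current value differs from the original snapshot.
        return {f: (self.original.get(f), v)
                for f, v in self.current.items()
                if self.original.get(f) != v}

    def banner(self) -> str:
        return ("You have made changes to this record."
                if self.changes() else "")
```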
Added the ability to include a field-level note - users could provide context on an entire record or focus attention on a specific field value.
Added the ability to suppress fields - enabling Data Axle to withhold sensitive data from all data deliveries.
All of these revisions were incorporated into the final design.
RESULTS:
I designed a comprehensive experience grounded in user-centered research, then built a high-fidelity prototype for on-site validation testing. These sessions enabled rapid iteration through workshops with data-processing users before engaging engineering workstreams.
Once the design was finalized, I documented the components and interaction patterns for engineering handoff.
For Merge Review, and all subsequent Data Axle software, I also defined a call-to-action blue highlight pattern. This focused user attention on key decisions in each record review and enabled faster page scanning for data-processing users.
This Merge Review workstream reflects a consistent theme of my UX work at Data Axle: unifying fragmented on-prem client software into a streamlined, cloud-based platform that empowers data-processing teams, accelerates engineering, and delivers higher quality data to B2B and B2C customers.
Data processing user outcomes:
Contextual page titles for better orientation.
Clear, intuitive page layouts that supported faster scanning.
Prominent calls-to-action enabling confident, data-driven decisions.
Greater visibility into how individual contributions affected downstream data quality.
Consistent definitions of “quality” across all teams.
Standardized taxonomy across software.
An end-to-end consistent user experience across workflows.
Engineering outcomes:
A standardized CSS library.
A unified Bootstrap component and pattern library, balancing user-specific needs with consistency, and providing visibility and control over variations.
Defined, consistent interaction patterns.
Reduced development overhead 💰.
Accelerated product cycles.
My research provided context and alignment that eliminated ambiguity in engineering workstreams, freeing engineers to focus on technology rather than one-off UX decisions, and reclaiming valuable development time.
B2B and B2C data customer outcomes:
Transitioned from monthly data loads to real-time asynchronous drops.
Enabled ingestion and distribution through API access via AWS.
Delivered more consistent, higher quality data.
Enabled real-time access to business and consumer data.
The redesigned cloud-based platform drove impact:
Empowered data-processing teams to make faster, more informed decisions with full pipeline visibility.
Accelerated engineering through clear UX research, defined design systems, and standardized patterns.
Produced higher quality data for B2B/B2C customers in a fraction of the time.
Overall, these internal tooling optimizations and engineering automations achieved a 24-hour data turnaround and a 97% efficiency gain.
< Back to top
SUCCESS METRIC 2
Create impactful branding to position the Data Axle platform as Infogroup’s leader in data quality
SITUATION:
Infogroup B2B and B2C data customers relied on a patchwork of data teams that supplied varied data solutions and lacked consistent quality, management, and supporting documentation. As one of the data teams at Infogroup, the Data Axle platform team needed a bold visual identity to stand out within Infogroup’s fragmented data ecosystem and present a modern narrative of real-time data access within Infogroup and the B2B/B2C data industry.
PROGRESS METRICS:
“Create impactful branding to position Data Axle as the leader in Infogroup data quality for internal teams and end customers, focusing on real-time data ingestion, cleansing, availability, and distribution, while highlighting search, submission, and subscription capabilities.”
PROCESS:
Audited the multiple Infogroup data processing teams’ branding and customer positioning.
Audited B2B/B2C data industry branding and customer positioning.
Leveraged these audits and internal stakeholder input to design an impactful logo and identity that reinforced Data Axle platform’s core strengths.
Collaboratively refined with platform, branding, and marketing teams.
RESULTS:
Designed a Data Axle logo inspired by core real-time platform technology of data ingestion, cleansing, availability, and distribution, while highlighting search, submission, and subscription capabilities.
Extended the logo use to all platform software branding, business and conference collateral, and an office environment mural.
Created a modern branded solution that differentiated the platform team’s solutions from other fragmented Infogroup teams and aligned with industry branding trends.
This work enabled the Data Axle platform to provide a visible, consistent, trusted signal of data quality across internal teams and for Infogroup customers. Over time, customers began associating all high-quality Infogroup data solutions with the Data Axle platform and this visual identity. This association was so prevalent that the company adopted Data Axle as its new name, rebranding entirely around the platform’s principles, design, and proven success.
< Back to top
SUCCESS METRIC 3
Design and illustrate a visual identity system to showcase Data Axle’s capability-driven value and impact
SITUATION:
Research and prospect interactions revealed clear communication gaps. The complexity of Data Axle’s technology, data models, and processes often made it difficult to convey capability-driven value, especially to non-technical thought leaders. Simplifying this conversation was essential to demonstrating impact and building understanding across less-savvy audiences.
To address this, I created simplified narratives, visual frameworks, and UX storytelling artifacts that translated technical complexity into clear, capability-driven value, enabling less-savvy stakeholders to quickly understand impact and align on opportunities.
PROGRESS METRICS:
“Enable a more effective learning path and better conceptual alignment of the capability-driven value for less-technical users, while promoting trust, quality, and technical accuracy for all users.” I aimed to support technical concepts with clear visual narratives and data visualizations.
PROCESS:
I conducted UX research and design iterations with broad socialization and cross-functional participation to help bring the teams along with me in this illustration adventure. I needed confirmation that I was hitting the right notes for both internal and external users.
Research insights
All user bases would welcome the additional layer of capability-based visual clarity
The largest concern and my primary design challenge was to ensure that any added images provided clarity and deepened user comprehension without oversimplifying the intent
I utilized and extended the messaging, intention, and visual style of previous visuals I had created for the Data Axle platform logo (data ingestion, data cleansing and real-time data availability)
Socialized all concepts and designs across product and engineering teams for coverage and alignment
Usability tested a number of concepts to ensure a balance between visual language and technical content was achieved
Continued iterative illustration and long-term support for the vision
RESULTS:
Connected real-world use cases with platform data and capabilities by creating a visual identity system for illustrations, infographics, and data visualizations.
Created more approachable, consistent visual language, providing levity to the dry technical documentation that was focused mainly on implementations.
Extended the visual narrative to support Business data, Consumer data, and B2C database content.
Updating and adding to this visual language was a continual parallel effort, as the narrative of Data Axle data was in constant evolution.
Database logos
Created logos for the platform’s three distinct databases:
Places = Business data - 20 million records | 200M contacts | 469 attributes
People = Consumer data - 300M records | 500M emails | 311 attributes
B2C = 116M B2C links | 88M business emails | 415 attributes
API artwork
I utilized and extended the intention and visual style of the Data Axle logo and previous platform artwork, while focusing on specific intent of each API.
Created a card-like structure that could be replicated across our API family to create distinct illustrations that represent platform core functionality, while alluding to Data Axle search, submit, and subscribe tenets.
Platform artwork
Created illustrations, infographics, and data visualizations to supplement conference media, presentations, monthly newsletters, and email blasts, providing a more colorful and approachable understanding of the concepts and technologies available through Data Axle.
Conference animation
Narrated by our CTO at the conference (narration not available).
The Data Axle visual identity system evolved into a vibrant, capability-driven design language and a visual partner that brought clarity to users across the complex data, platform software, presentations, conferences, and stakeholder interactions.
Building on these artifacts, I used diagrams and workflow visualizations to illustrate how the platform handled ingestion, processing, and distribution at scale. I facilitated workshops with cross-functional teams to test messaging and refine terminology that resonated with non-technical leaders. I also developed prototypes and guided demos to ground conversations in user outcomes rather than abstract technology. Together, these efforts helped cross-functional stakeholders quickly understand the impact, align on priorities, and build confidence in Data Axle’s long-term vision.
< Back to top
SUCCESS METRIC 4
Design and drive the adoption of comprehensive cloud-based documentation storytelling that engages and empowers all users
SITUATION:
Documentation was inconsistent, fragmented, and overly technical, lacking version control and a central source of truth. As a result, less-savvy product leaders struggled to understand Data Axle’s data, services, and API capabilities. This forced them to rely heavily on support and engineering teams, creating significant team debt and slowing efficient platform adoption.
PROGRESS METRICS:
“Deliver effective cloud-based documentation for both users and internal teams from a single, accessible, and consistent source of truth, simplifying the complexity of platform technology and clearly communicating its capability-driven value.”
My goals were to eliminate inconsistencies and redundancies, extend platform clarity, and align all documentation with Data Axle’s new visual identity system.
PROCESS:
I led research and discovery to assess existing documentation, then partnered with Product Management and Engineering to identify communication gaps and highlight opportunities for improvement. From this, I designed a cloud-based documentation portal that prioritized ease of navigation and comprehensive coverage, and used our visual identity system to communicate capability-driven value.
Process -
Research & alignment – Socialized findings across teams, identifying common patterns, communication gaps, and requirements.
Unification – Consolidated fragmented resources into a single documentation system for Business, Consumer, and B2C data, improving clarity, consistency, and access.
Design & structure – Created a portal with clear categorization and navigation across datasets and API documentation.
Iteration – Initial assumptions favored parallel text explanations for less-savvy users, but research showed this bloated content and hindered technical audiences.
Design evolution -
Through additional user research, I defined an opportunity to explain capabilities through infographics and documentation illustrations aligned with our visual identity. I created redesigned documentation and functional proof-of-concept prototypes, enabling users to navigate and provide targeted feedback through validation testing.
Implementation -
I partnered with the platform team to implement and support the portal over time. The documentation system became an ongoing, evolving process that grew in parallel with the evolution of Data Axle’s data story.
RESULTS:
Created highly useful and easily accessible online documentation.
Sales teams, data-cleansing teams, technical support teams, internal engineering teams, and B2B/B2C end customers had access to the most current documentation from one consistent, trusted source.
The new cloud-based documentation was accessible and easy to navigate, with the ability to switch between People/Consumer data, Places/Business data, and B2C linking data.
Documentation included:
Data & APIs
Getting Started and Onboarding
Data Axle data and Delivery Schema Introductions
FAQ and Support access
Data Guides - all data segments
Data Dictionary
Lookup tables
Packages
Field Translators
Less-savvy product leaders were better equipped with a more holistic understanding of the data they were licensing.
Support teams and Data Axle engineering teams recouped valuable time.
Illustrations extended the Data Axle branding look and feel into a more robust, holistic visual identity.
Verbiage, taxonomy, and overall messaging were iteratively updated and adjusted as our offerings evolved.
“Data Axle’s online documentation is our new knowledge center for Infogroup’s core databases.”
- Mike Iaccarino - Data Axle CEO
Utilized API illustrations in API documentation
3 Data Guide examples with illustrations
Additional illustrations
Other important documentation pages
Data Dictionary - definitions of all the data fields available in Data Axle.
Lookup Tables - efficient repositories for predefined values and mappings.
Packages - predefined data segments.
Field Translator - legacy system to Data Axle translations.
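The Field Translator concept is essentially a lookup table from legacy field names to their Data Axle equivalents. A minimal sketch (the field names below are invented for illustration; the real mappings lived in the documentation's lookup tables):

```python
# Hypothetical field-translator sketch; mapping entries are invented examples.
LEGACY_TO_DATA_AXLE = {
    "CONAME": "name",          # legacy company-name field
    "STADDR": "street",        # legacy street-address field
    "IGID": "infogroup_id",    # Infogroup ID
}

def translate_record(legacy: dict) -> dict:
    """Rename legacy fields to Data Axle equivalents, keeping unmapped fields as-is."""
    return {LEGACY_TO_DATA_AXLE.get(k, k): v for k, v in legacy.items()}
```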
The new Data Axle cloud-based documentation enabled valuable access to self-service support documentation and a comprehensive guide for internal teams and both new and existing customers, empowering technical teams and their less-savvy cross-functional partners with a deeper, shared understanding. 👍🏼
< Back to top
SUCCESS METRIC 5
Design a self-signup and free trial workflow to empower user onboarding at scale while reducing support overhead
SITUATION:
The new-customer onboarding process required substantial introductory relationship management and hand-holding. Quality and coverage varied widely depending on who facilitated onboarding and how much time they could dedicate, resulting in inconsistent experiences and outcomes.
As our initiatives successfully drove customer engagement and productivity, platform product managers, engineering, and support teams were frequently pulled into randomized conversations with new or prospective customers, discussing data solutions, technologies, and free trial setup. It was a good problem to have 🙃, but one that regularly diverted focus away from product work.
PROGRESS METRICS:
“Create a self-signup process for customers to register for a free trial of platform APIs and explore datasets aligned to their business needs, educating them on the data model while reducing dependency on Product, Sales, and Support teams.”
PROCESS:
Partnering with engineering leadership, platform team engineering, key stakeholders, support team and external users, I drove a cross-functional, user-centered research and data-driven design process.
Conducted and synthesized research studies across multiple product offerings, each with its own authentication and heavy support signup process.
Socialized research and design through cross-functional workshops to gain internal feedback and promote a common understanding of the customer’s journey and expectations across all workflows.
Designed and prototyped cohesive workflows.
Conducted usability studies to validate positioning and flow.
OUTCOME:
I designed a self-signup workflow for account creation with a free 30-day trial of our business, consumer, or B2C data.
Empowered customers with hands-on engagement and learning
This streamlined onboarding experience drove clarity and adoption for customers
Substantial increase in customer data licensing trials
Dramatic reduction in product management, sales, and support randomization, enabling substantial cross-functional cost savings for internal teams and the reprioritization of their time toward other important business initiatives.
Operational cost savings for internal teams
Free 30-day trial for customers to better understand our data and what they want prior to financial discussions in their organization
Highlighted the need to expand self-service efforts 🧐
Unfortunately, this story is a little light: a company-wide restructuring meant I wasn’t able to see this effort through or gather more details. 🙃
I led a UX-driven overhaul of Infogroup’s data-processing pipeline, software, and the platform team’s visual language — uniting design and engineering to transform Infogroup’s culture, accelerate cross-team efficiencies, and elevate user productivity! 🤓
The Data Axle platform became synonymous with data quality and trust both internally and externally for our B2B customers. These initiatives were transformational for the platform team and Infogroup as a whole, and heavily influenced a reimagined company-wide strategic long-term vision and rebrand to Data Axle.
P.S. Here’s a little positive press from my time at Infogroup/Data Axle…
Infogroup announced as a leader in the Forrester Wave -
B2B Marketing Data Providers 2018 report
Key findings regarding Infogroup in the report:
Infogroup received the highest possible score in several criteria, such as data management, integrations and APIs, go-to-market strategy, product roadmap, revenue, and customer base.
The report highlighted Infogroup's capability to identify and integrate personal and professional insights on executive buyers for improved targeting and personalization.
Infogroup's advanced data management was noted for enabling customers to integrate data sources and create detailed profiles.
Hello there! 😊
Have any questions or want to discuss? Connect with me on Linkedin or send me a message below to hear more.