Uniform Data System (UDS) Reporting in Healthcare: Everything You Need to Know
The Uniform Data System is an annual reporting system that provides standardized information about the performance and operation of health centers serving underserved communities. UDS reporting is a cornerstone of compliance for Federally Qualified Health Centers and other Health Center Program participants, ensuring accountability for the federal funding they receive.
Each year, health centers must compile a core set of data on patient demographics, services provided, clinical processes and outcomes, staffing, and finances.
This consistent dataset allows the Health Resources and Services Administration to evaluate and improve health center performance, ensure compliance with legislative mandates, and identify trends in access to care, quality, and costs. In practice, UDS reporting in healthcare fulfills multiple purposes: it documents the impact of health centers, guides HRSA’s policy decisions, and supports data-driven quality improvement at both the federal and local levels.
From a funding and oversight perspective, UDS reporting is absolutely critical. In fact, UDS is the primary mechanism HRSA uses to report to Congress on how Section 330 grant funds are utilized. The data reported directly affect patient target assessments, funding levels, and how effective a health center appears “on paper”. Errors or inaccuracies in UDS submissions can therefore create financial risk and paint an incomplete picture of a health center’s impact.
For healthcare administrators, compliance teams, and executives, mastering UDS reporting is not just about meeting a bureaucratic requirement; it’s about demonstrating the value of their programs, securing continued funding, and informing internal strategic decisions. In this comprehensive guide, we will explain everything you need to know about UDS reporting.
What Is the Uniform Data System (UDS)?
The Uniform Data System is a standardized data set and reporting system used by HRSA-supported health centers to report on their activities and performance annually.
In simple terms, the UDS is an extensive annual report that every federally supported health center must submit, summarizing who they served, what services they provided, and how they performed in key clinical and financial areas.
UDS reporting began in the 1980s and was formalized as the Uniform Data System in 1996.
It has since evolved into a robust tool for capturing the breadth of work done by community health centers across the nation.

UDS Data Content

The UDS report covers a wide range of information about a health center’s operations. Its core components include:
- Patient Characteristics: Number of patients served and their socio-demographic details (age, sex, race, ethnicity, language, income level, insurance status, etc.).
- Services Provided: Types and quantities of services delivered (e.g., medical visits, dental visits, behavioral health encounters, enabling services).
- Staffing and Utilization: The staffing profile and how patients utilize services.
- Clinical Quality and Outcomes: Key clinical process and outcome measures, often aligned with quality benchmarks (for example, hypertension control rate, diabetes control, cancer screening rates, birth outcomes).
- Financial Data: Costs of operations and revenues, which speak to the health center’s financial efficiency and sustainability.
In essence, UDS is a 360-degree view of a health center’s annual performance. It is a non-duplicated data set covering the entire scope of services included in the center’s approved project. HRSA relies on UDS data to assess the impact of the Health Center Program and to drive quality improvement initiatives. For the health centers themselves, the UDS offers a valuable dataset to analyze their service delivery and outcomes year over year, benchmark against state or national trends, and identify areas for improvement.
It’s important to note that UDS reporting is mandated by law for certain health centers. The report is submitted electronically through HRSA’s online data system and undergoes a rigorous review process. Over time, UDS has also incorporated new reporting elements to keep pace with emerging healthcare priorities. Despite its comprehensive nature, UDS data is aggregate and de-identified at the report level – it summarizes counts and percentages, not individual patient records.
Who Is Required to Submit UDS Reports?
Which organizations must file UDS reports? UDS reporting is required of all health centers that participate in or resemble HRSA’s Health Center Program. Specifically, the following entities are mandated to submit a UDS report annually:
HRSA Health Center Program Awardees
Any organization that receives federal grant funding under Section 330 of the Public Health Service Act must report UDS data. This includes health centers funded under each of the four Section 330 sub-programs:
- Community Health Centers (CHC) – Section 330(e) funding
- Migrant Health Centers (MHC) – Section 330(g) funding
- Health Care for the Homeless (HCH) – Section 330(h) funding
- Public Housing Primary Care (PHPC) – Section 330(i) funding
If a health center receives any of these grant funds, it is considered a Section 330 awardee and must submit a full UDS report covering all patients and services within its scope.
Health Center Program Look-Alikes
These are health centers that meet all the requirements of the Health Center Program but do not receive Section 330 grant funding. Look-alikes are required to report UDS just like funded centers. Essentially, if an organization is designated as a health center look-alike by HRSA, it must comply with UDS reporting to maintain that designation.
Certain HRSA BHW-funded Centers
Health centers funded by HRSA’s Bureau of Health Workforce – for example, some primary care clinics that are part of BHW programs – may also be required to submit UDS reports. These cases are less common and are typically specified by HRSA in the grant conditions.
In practical terms, this means nearly all Federally Qualified Health Centers in the U.S. and its territories, which include over 1,400 organizations operating thousands of service delivery sites, complete UDS reports each year. This encompasses community clinics serving agricultural workers, homeless populations, public housing residents, and the general medically underserved populations. The requirement spans across all 50 states, D.C., Puerto Rico, the U.S. Virgin Islands, Pacific Basin territories, and any new expansion sites that came on board during the year.
Timing of eligibility
There is an important cutoff date: a health center that is initially funded or designated before October 1 of the reporting year is required to submit a UDS report for that year.
- For example, if a new start grant is awarded in September, the health center will need to report data for the portion of the year it was operating.
- Conversely, if an organization becomes a new start or look-alike in late October or after, HRSA generally does not require a UDS report for that year.
- This “October 1” rule is explicitly stated by HRSA to clarify reporting obligations for new awardees.
- Health centers that underwent a merger or acquisition during the year are also handled under this rule – if the entity existed before Oct 1, it must report, possibly consolidating data as needed.
Universal vs. Grant Reports
All UDS-reporting health centers must submit a Universal Report, which is the aggregate report of all patients and activities.
- Additionally, health centers that receive multiple distinct Section 330 grants must submit separate Grant Reports for each of those special population programs.
- Grant-specific UDS reports are essentially a subset of the data, focusing only on patients served in the Migrant, Homeless, or Public Housing programs.
- For instance, if a health center receives both CHC and HCH funding, it will do one Universal UDS and one Grant Report just for the HCH segment.
- The vast majority of health centers only have one grant, so they only do the Universal report.
- Those with multiple funding streams must be careful to ensure their grant-specific data is a subset that doesn’t exceed the universal counts.
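The subset rule above lends itself to a simple automated check before submission. The sketch below is a hypothetical illustration (the line names are not official UDS identifiers): it flags any count in a grant-specific report that exceeds the corresponding Universal Report count.

```python
# Hypothetical consistency check: every count in a grant-specific report
# must not exceed the corresponding count in the Universal Report.
# The keys ("total_patients", "medical_visits") are illustrative only.

def find_subset_violations(universal: dict, grant_report: dict) -> list:
    """Return (line, grant_value, universal_value) tuples where a
    grant-specific count exceeds the Universal Report count."""
    violations = []
    for line, grant_value in grant_report.items():
        universal_value = universal.get(line, 0)
        if grant_value > universal_value:
            violations.append((line, grant_value, universal_value))
    return violations

universal = {"total_patients": 12500, "medical_visits": 41000}
hch_grant = {"total_patients": 1800, "medical_visits": 43000}  # entered wrong

print(find_subset_violations(universal, hch_grant))
# A non-empty result means the grant report needs correction before submission.
```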
UDS Reporting Timeline and Annual Cycle
UDS reporting follows a well-defined annual cycle aligned with the calendar year. Understanding the timeline is essential for planning the data collection and submission process. Here is an overview of the UDS reporting timeline and key milestones in the annual cycle:
Calendar Year Data Collection (Jan 1 – Dec 31)
UDS is a calendar-year report, meaning it covers all in-scope activities from January 1 through December 31 of a given year. Health centers should be continuously collecting and organizing the required data during this period. Even if a health center’s fiscal year is different, for UDS purposes the reporting period is always the calendar year (Jan–Dec). All patients and services within the health center’s approved scope of project during that year must be accounted for.
Preliminary Reporting Environment (Late October)
In the fall, HRSA opens a Preliminary Reporting Environment for UDS. The PRE is essentially a test environment where health centers can start entering their data before the official reporting window opens.
It allows teams to get a head start, familiarize themselves with the electronic forms, and even identify any data issues early. Along with the PRE, HRSA usually provides offline tools in the fall to assist centers in preparing their data. Using the preliminary period is optional but highly encouraged, as it can reduce last-minute crunch and flag any errors that might need correction while there’s still time in the year.
Reporting Window Opens (January 1)
On January 1, immediately after the reporting year ends, the official UDS reporting period begins. The UDS reporting module becomes available in HRSA’s Electronic Handbooks system on January 1.
At this point, health centers can begin entering their finalized data into the EHB online forms for the Universal Report and any Grant Reports. Practically, many health centers use the first couple of weeks of January to run final reports from their systems and populate the UDS tables.
UDS Submission Due Date (February 15)
February 15 is the key deadline – the UDS report must be submitted by February 15 for the prior year’s data. (If Feb 15 falls on a weekend or holiday, HRSA may adjust the deadline to the next business day, but the guidance consistently uses Feb 15.)
- By this date, the health center’s UDS data should be complete, reviewed internally, and officially submitted through EHBs.
- Missing the February 15 deadline is considered non-compliance, so health centers are strongly advised to plan to submit a few days early to avoid any technical glitches or last-minute issues.
In fact, HRSA’s guidance suggests having the report ready for internal review at least 5 days before submission. Once submitted, the report is locked for HRSA review (though it can be unlocked later for corrections if needed, as described next).
HRSA Review Period (Feb 15 – Mar 31)
After submission, there is a review period from February 15 through March 31 during which HRSA’s UDS reviewers examine the data.
- Each health center is assigned a UDS reviewer who will analyze the submitted data for accuracy and consistency.
- This period is an interactive process: the reviewer may send queries or feedback to the health center if certain data points look unusual or violate validation rules.
- The EHB system itself flags many edit checks automatically – for example, if a total on one table doesn’t match a total on another, or if a value is far outside expected ranges, an electronic edit will trigger.
- Health centers must respond to these edits either by correcting the data or providing explanatory comments.
- During the review period, the UDS report can be unlocked by the reviewer to allow the health center to make necessary corrections.
- It’s critical to address all questions and edits promptly and substantively; explanations like “Verified with our EHR” or “That’s what our data shows” are not considered sufficient.
- Reviewers expect either a correction or a meaningful explanation grounded in the health center’s context.
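The automated edit checks described above can be pictured as simple cross-table comparisons. This is a conceptual sketch only, loosely modeled on the kind of validation EHB applies; the table names are examples, not the actual edit rules.

```python
# Illustrative cross-table edit check: two totals that describe the same
# population (e.g., unduplicated patients on different tables) must agree.

def cross_table_edit(total_a: int, total_b: int, label_a: str, label_b: str):
    """Return an edit message if two totals that must agree do not."""
    if total_a != total_b:
        return (f"EDIT: {label_a} total ({total_a}) does not match "
                f"{label_b} total ({total_b}); correct the data or add a comment.")
    return None

# Example: patient totals on two tables disagree by 6, triggering an edit.
msg = cross_table_edit(10431, 10425, "Table 3A patients", "Table 4 patients")
print(msg)
```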
Final Corrections and Data Lock (by March 31)
March 31 is the final cutoff for any revisions. All corrected submissions must be finalized by March 31. After this date, the UDS database for that year is closed – no further changes are permitted.
Essentially, March 31 is when HRSA locks in the data for analysis and reporting. Health centers should ensure that by this date, every table is complete and that the person with submission authority has marked the UDS report as complete and accurate and resubmitted if it was unlocked for changes. The final submission includes a certification attesting to the accuracy of the report.
Post-Submission and Feedback (Spring/Summer)
After March 31, HRSA compiles the national UDS data. In the months following, health centers typically receive some feedback or final correspondence from HRSA – for example, a report summarizing their data or a comparison to state/national averages. HRSA also uses UDS data to determine Quality Improvement awards and other recognitions later in the year. By August or September, national and state UDS summaries are published on HRSA’s data website for public viewing. Health center leadership often debriefs internally on the UDS results once final, using them to identify successes or areas for improvement.
Core Components of the UDS Report
The UDS report is extensive, comprising multiple tables and sections. However, it can be understood by breaking it down into its core components or content areas.
Each component corresponds to major questions: Who are the patients? What services were provided? Who provided the services? How well did we perform clinically? What did it cost, and how was it paid for? Below are the core components of a UDS report:
Patient Profile (Demographics)
This component captures who the health center’s patients are. It includes total unduplicated patient counts and breakdowns by characteristics such as age group and sex assigned at birth, race and ethnicity, language preference, and income relative to the federal poverty level and insurance status. It may also count special populations. This section answers what populations the health center is reaching.
Service Area (Geographic)
UDS also looks at where patients reside via the ZIP Code Table. The ZIP Code data lists how many patients come from each ZIP Code in the health center’s service area, along with their insurance coverage mix. This helps illustrate the geographic reach and penetration of the health center.
Staffing and Utilization
These components show what services are provided and by whom. Table 5 details the full-time equivalent staff by role and the number of patient visits and patients by service category.
It essentially links human resources to service outputs. There is also a Table 5 Addendum that provides a more detailed breakout of certain services, specifically mental health and substance use disorder services, by the type of provider delivering them. The staffing and utilization data together show how the health center’s capacity translates into services delivered.
Clinical Measures (Quality of Care and Outcomes)
UDS includes tables that track clinical quality performance. Table 6A captures counts of patients and visits related to key diagnostic categories and services, for example, the number of patients with hypertension and the number of hypertension-related visits, or the number of prenatal care patients and related visits. It’s a mix of disease burden and service provision statistics.
- Table 6B is a set of clinical quality measures, typically reported as denominators, numerators, and resulting percentages.
- These measures align with standard quality indicators like:
- Percentage of hypertensive patients with controlled blood pressure,
- Percentage of diabetic patients with HbA1c under control,
- Cervical cancer screening rate,
- Childhood immunization status,
- Weight assessment and counseling for children, etc.
- Table 7 focuses on health outcome measures, including outcomes like birth weight for prenatal patients and, in some years, a breakout of clinical outcomes by race/ethnicity to identify disparities.
- Collectively, the clinical tables demonstrate how well the health center is performing in key clinical domains and health outcomes for its patients.
Financial Data (Costs and Revenues)
UDS collects detailed financial information to show how resources are used and obtained.
- Table 8A reports all operating costs of the health center, broken into categories. It distinguishes direct costs for clinical services and overhead/administrative costs.
- Tables 9D and 9E cover revenues: Table 9D is Patient Service Revenue by payer – it lists gross charges, adjustments/discounts, and collections for each major payer category, including sliding fee adjustments and bad debts.
- Table 9E is Other Revenue, such as federal grants, state/local grants, fundraising, and other income.
Together, these financial tables show whether the health center’s costs are balanced with revenues, the mix of funding sources, and the extent of uncompensated care. HRSA and health centers use this information to assess financial viability and efficiency.
Other Reporting Forms
In addition to the main tables above, the UDS report includes a few supplemental forms that capture other important data:
- Health Information Technology (HIT) Form (Appendix D): This form collects information on the health center’s use of health IT, including whether they use a certified Electronic Health Record, how they use data, patient portal adoption, and any other health IT capabilities. It essentially gauges the level of EHR integration and technology use in the practice.
- Other Data Elements Form (Appendix E): This covers a few specific topics that HRSA tracks, such as Medications for Opioid Use Disorder provided, use of telehealth, outreach and enrollment assistance provided, and voluntary family planning services. These data points don’t fit neatly into the main tables, so they are asked separately.
- Workforce Form (Appendix F): This form asks about workforce development and satisfaction, including any initiatives for staff training and whether the health center conducts and uses results from provider/staff satisfaction surveys. It reflects HRSA’s interest in health center workforce well-being and development.
Each of these core components is represented by one or more tables in the UDS report. In total, the full-year UDS includes 11 tables and 3 forms that cover all the demographic, clinical, operational, and financial data described. The Universal Report contains all these tables/forms for the entire patient population, while any Grant Reports contain a subset for the specific special population program.
Data Sources Required for UDS Reporting
UDS reporting draws on data from multiple systems and sources within a health center. Preparing a complete UDS report is an interdisciplinary data effort. Knowing where each piece of data will come from is crucial for efficient and accurate reporting. Below are the key data sources typically required for UDS and what they contribute:
Electronic Health Record / Practice Management System
The EHR is often the primary source for clinical and demographic data. Patient registration information in the practice management module provides demographics such as
- Date of birth,
- Sex,
- Race,
- Ethnicity,
- Preferred language,
- Income, and
- Insurance type.
The EHR’s clinical encounter data provides information on visits and diagnoses/procedures that feed into Tables 5, 6A, and 6B.
For example, EHR reports can count how many visits each provider had, how many patients were diagnosed with hypertension, and calculate performance on quality measures like blood pressure control by analyzing structured data in the EHR.
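The kind of EHR-driven measure calculation described above can be sketched in a few lines. This is a hypothetical example over a structured data extract; the field names and thresholds shown are illustrative assumptions, not an actual EHR schema or the official UDS measure specification.

```python
# Hypothetical extract of structured EHR data: one record per hypertensive
# patient with the last recorded blood pressure of the year.

patients = [
    {"id": 1, "dx": "hypertension", "last_bp": (132, 84)},
    {"id": 2, "dx": "hypertension", "last_bp": (150, 96)},
    {"id": 3, "dx": "hypertension", "last_bp": (138, 88)},
]

def bp_controlled(systolic: int, diastolic: int) -> bool:
    # UDS-style control threshold: most recent BP below 140/90
    return systolic < 140 and diastolic < 90

denominator = len(patients)
numerator = sum(bp_controlled(*p["last_bp"]) for p in patients)
print(f"Hypertension control: {numerator}/{denominator} "
      f"({100 * numerator / denominator:.1f}%)")
```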
Billing/Claims System
Many health centers use an integrated practice management system for billing or a separate billing system. The billing records are the source of financial data, especially for Table 9D. Charges, payments, and adjustments are recorded by payer as claims are processed.
To report UDS financials, the finance or billing department will run reports of total charges by payer, total collections by payer, write-offs, and sliding fee discounts given. These numbers populate the revenue table. If the EHR and billing are integrated, this may be just one system query. But if not, the billing software or financial ledgers must be queried for those figures.
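Summarizing claims by payer, as described above, is essentially a grouping exercise. The sketch below is illustrative; the record fields, payer names, and dollar amounts are made up and do not reflect actual Table 9D line definitions.

```python
# Sketch of rolling up billing records into Table 9D-style figures by payer.
claims = [
    {"payer": "Medicaid",  "charges": 210.0, "collected": 150.0, "adjustment": 60.0},
    {"payer": "Medicaid",  "charges": 180.0, "collected": 120.0, "adjustment": 60.0},
    {"payer": "Uninsured", "charges": 160.0, "collected": 40.0,  "adjustment": 120.0},
]

by_payer: dict = {}
for c in claims:
    row = by_payer.setdefault(
        c["payer"], {"charges": 0.0, "collected": 0.0, "adjustment": 0.0})
    for k in ("charges", "collected", "adjustment"):
        row[k] += c[k]

print(by_payer["Medicaid"]["charges"])  # 390.0
```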
General Ledger / Accounting System
The expense data usually comes from the organization’s accounting system or general ledger. The finance team typically tracks expenditures by category and sometimes by program.
- For UDS, those expenses need to be allocated into the specific categories required.
- Many health centers maintain a chart of accounts that mirrors UDS categories or can be mapped to them.
- Finance staff will prepare Table 8A using the yearly trial balance or audited financial statements as the basis, ensuring that the total expenses reported in UDS match the organization’s actual expenditures for the year.
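The chart-of-accounts mapping mentioned above can be sketched as a simple lookup-and-sum. The account numbers and category names here are illustrative assumptions; a real mapping follows the health center's own chart of accounts and the UDS manual's cost categories.

```python
# Sketch of mapping general-ledger accounts to UDS-style cost categories.
GL_TO_UDS = {
    "6100": "Medical",          # provider salaries (hypothetical account)
    "6200": "Dental",
    "7100": "Facility",         # rent, utilities
    "7200": "Administration",
}

ledger = [("6100", 1_250_000), ("6200", 400_000), ("7100", 180_000),
          ("7200", 310_000), ("6100", 95_000)]

costs: dict = {}
for account, amount in ledger:
    category = GL_TO_UDS[account]
    costs[category] = costs.get(category, 0) + amount

print(costs)
# Total reported should reconcile to the trial balance:
assert sum(costs.values()) == sum(amount for _, amount in ledger)
```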
Human Resources/Payroll Records
Information about staffing often requires input from HR records. While one might extract FTEs from the practice management system scheduling data, it’s typically more accurate to use HR data to determine the annualized full-time equivalents of each category of personnel.
- For example, the HR department can provide a list of all employees or contractors, their roles, and their FTE status.
- This is aggregated by category to report the total FTE for each.
- Additionally, if Table 5A or the Workforce Form is in use, HR would provide data on staff longevity, training, and satisfaction survey results.
- In many health centers, a UDS coordinator will work closely with the HR manager to get the staffing numbers right.
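Annualizing FTEs from an HR roster, as described above, follows a simple rule: a 1.0 FTE employee who worked 6 of 12 months contributes 0.5 annualized FTE. The roster fields and personnel categories below are illustrative, not an actual HR export format.

```python
# Sketch of annualizing and aggregating FTEs by personnel category.
roster = [
    {"name": "A", "category": "Physicians",          "fte": 1.0, "months": 12},
    {"name": "B", "category": "Physicians",          "fte": 0.8, "months": 6},
    {"name": "C", "category": "Nurse Practitioners", "fte": 1.0, "months": 12},
]

fte_by_category: dict = {}
for person in roster:
    annualized = person["fte"] * person["months"] / 12
    fte_by_category[person["category"]] = (
        fte_by_category.get(person["category"], 0.0) + annualized)

print(fte_by_category)
```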
Clinical Logs / Other Program Data
Some UDS elements might not be fully captured in the EHR. For instance, enabling services may not always be recorded as encounters in the EHR if those are not billable visits.
Health centers often keep internal logs or use EHR placeholders to track enabling service encounters. Similarly, outreach and enrollment assistance counts might come from logs kept by outreach workers.
The number of patients on Medication-Assisted Treatment for opioid use disorder might be tracked via a registry or pharmacy records, if not easily pulled from the EHR. Telehealth visits might need a filter or flag in the encounter data to count them specifically. Thus, part of UDS prep is identifying any data that lives outside the main systems and gathering it.
Survey or Reporting Tools
For some parts, like the Health IT Form, the information might not be in any automated system but known to IT leadership. These often require someone knowledgeable to answer yes/no questions or provide counts. Likewise, the Workforce satisfaction information might come from an internal staff survey; if the health center did one, they would report whether they did it and how they use the results, which might be documented in a narrative or QI report rather than a database.
Role of the EHR in UDS Reporting
Electronic Health Records play a pivotal role in UDS reporting. In fact, successful UDS submissions often correlate with how effectively a health center leverages its EHR for data capture and reporting.
Here we explore the various ways in which the EHR impacts UDS reporting and how EHR–UDS integration can streamline the process:
Primary Source of Clinical Data
The EHR is where clinicians document patient visits, diagnoses, treatments, and outcomes, all of which feed into UDS.
- For example, when a provider enters a diagnosis of diabetes and the patient’s A1c lab result into the EHR, that information later contributes to UDS.
- The accuracy of UDS clinical measures is directly tied to how consistently data is entered into structured fields in the EHR.
- If providers properly use problem lists, input vitals, and close charts with the correct visit types, the EHR can produce most of the needed counts automatically.
- A well-configured EHR can generate reports of how many hypertensive patients have blood pressure <140/90, or how many women got a Pap test, etc., matching the UDS definitions.
- Thus, the EHR is the engine for producing reliable UDS clinical data.
Built-in UDS Reporting Tools
Many EHR systems include UDS report templates or modules. These are pre-designed queries within the EHR that tally the UDS tables based on the data in the system.
- For instance, an EHR’s UDS module might generate Table 3A with one click, using registration data.
- It might also compile Table 6B measures by querying the clinical data. Using these tools can save significant time compared to manual counting.
- However, the EHR’s UDS logic must be up-to-date with HRSA’s requirements.
- Health IT teams should update to the latest UDS reporting package each year if the vendor provides it, since measure definitions can change.
- Even with these tools, validation is needed; the EHR may not capture every scenario.
- Still, EHR-provided UDS reports are an excellent starting point and reduce human error.
EHR Data Quality and Configuration
The role of the EHR isn’t just in reporting but in how it is used throughout the year. A key best practice is to configure the EHR templates and workflows in alignment with UDS data needs.
- For example, ensure that for each visit, the visit type or service is correctly categorized so that visits count in the right bucket on Table 5.
- Ensure that enabling service encounters are recorded so they aren’t missed.
- Train providers to use structured fields, e.g., entering smoking status in the discrete field rather than only in narrative, because UDS measures may rely on those fields.
- Many health centers conduct periodic data quality audits in the EHR to catch issues before UDS time.
Health IT Capabilities Reporting
Interestingly, UDS itself asks about the health center’s EHR and health IT capabilities. This includes whether the center has an ONC-certified EHR, uses electronic prescribing, participates in Health Information Exchange, etc.
The fact that HRSA collects this indicates how crucial EHRs are considered for quality reporting and care coordination. From a compliance standpoint, having a certified EHR is strongly encouraged.
UDS even tracks if the health center achieved Patient-Centered Medical Home recognition, which often ties into EHR use and quality improvement processes. All this to say, the EHR is not only the source of data but also a subject of reporting, reflecting HRSA’s push for health IT adoption in health centers.
EHR Integration with UDS+
As we will discuss later, HRSA’s UDS modernization is moving toward patient-level data submission via FHIR. In that model, an EHR could theoretically generate a UDS+ data file directly from its database to send to HRSA. This would represent an even tighter EHR-UDS integration, reducing manual steps.
Already, some health centers have tested exporting de-identified patient data from EHRs to meet UDS needs. So, investing in an EHR that can seamlessly extract data or connect via APIs will become even more beneficial.
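To make the patient-level direction concrete, here is a minimal conceptual sketch of a de-identified FHIR R4 Patient resource wrapped in a Bundle, the general shape of data an EHR could export. This is an illustration only; the actual UDS+ implementation guide defines the required profiles, extensions, and submission mechanics, none of which are shown here.

```python
import json

# De-identified FHIR-style Patient resource (conceptual sketch):
# no name or MRN, and birthDate reduced to the year to limit identifiability.
patient = {
    "resourceType": "Patient",
    "id": "example-001",   # internal de-identified ID (hypothetical)
    "gender": "female",
    "birthDate": "1980",
}

# FHIR resources are typically transmitted grouped in a Bundle.
bundle = {
    "resourceType": "Bundle",
    "type": "collection",
    "entry": [{"resource": patient}],
}

print(json.dumps(bundle, indent=2)[:80])
```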
Bridging Multiple Systems
Some health centers use more than one EHR. In such cases, integration is key. Many have turned to data warehouses or analytics platforms that aggregate data from multiple sources, including the EHRs, to produce unified UDS reports. The role of the EHR then extends to ensuring those integrations are properly mapping fields and that all systems speak to each other. Health Center Controlled Networks often assist with this kind of data integration for UDS.
UDS Tables Explained
The UDS report is composed of a series of standardized tables and forms.
To the uninitiated, these table names and numbers can be a bit cryptic. Here we break down each of the main UDS tables, explaining what each one is for and what data it contains:
Patients by ZIP Code
Often presented as the first table, this table lists the number of patients served, organized by their residence ZIP Code. It also typically shows the patients’ primary medical insurance source by ZIP. The purpose is to illustrate the health center’s service area and penetration.
Table 3A – Patients by Age and Sex Assigned at Birth
This demographic table shows the unduplicated patient count broken down by age groups and by sex assigned at birth. It gives a basic demographic profile of the patient population’s age and gender distribution. For example, one can see how many pediatric patients vs. geriatric patients were served.
Table 3B – Demographic Characteristics
This table details patient demographics such as race and ethnicity, language, and other characteristics. Specifically, it reports the number of patients by racial categories, by Hispanic/Latino ethnicity, and by language preference. Table 3B may also include data on agricultural worker status, veteran status, and housing status counts, depending on recent requirements. It provides a portrait of the cultural and social demographics of the patient population.
Table 4 – Selected Patient Characteristics
Table 4 focuses on socioeconomic characteristics and insurance. It shows patients by income level and by primary medical insurance type. Income categories typically include:
- Patients at 100% of poverty and below,
- 101–200% of poverty,
- Over 200% of poverty, and
- Unknown income.
Insurance categories include uninsured, Medicaid/CHIP, Medicare, Other Public Insurance, and Private Insurance. Table 4 also captures the counts of patients in certain special populations served. In addition, Table 4 includes any data on managed care enrollment. The table is basically about patients’ economic and insurance profiles, which is important for understanding payor mix and poverty levels.
Table 5 – Staffing and Utilization
Table 5 is a dual-purpose table covering the staff FTEs and the service delivery volumes. The first part of Table 5 lists the full-time equivalent of personnel by category: e.g., physicians, nurse practitioners, physician assistants, certified nurse midwives, dentists, dental hygienists, mental health providers, enabling services staff, and so forth.
The second part of Table 5 then lists the number of patient visits and the number of patients for each corresponding service category. Essentially, Table 5 connects the workforce to outputs: it shows how many visits were generated by each type of service and with how many staff. It also includes a line for total patients and total visits. This table is key to productivity and staffing analysis.
Table 5 – Addendum (Selected Service Detail Addendum)
This addendum provides more granular detail on certain services. Specifically, it breaks out mental health and substance use disorder services by the type of provider delivering them.
For instance, it might show how many mental health visits were provided by psychiatrists vs. by clinical psychologists vs. by licensed clinical social workers, etc., and similarly for SUD services by medical providers vs. behavioral health providers. The addendum helps HRSA see the composition of the behavioral health services – important for understanding integration of behavioral health into primary care.
Table 6A – Selected Diagnoses and Services Rendered
Table 6A is a compilation of various diagnostic and service categories with counts of patients and visits in each. For example, it includes lines for: number of patients with hypertension, with diabetes, with asthma, etc., representing prevalent diagnoses. It also includes lines for certain services/procedures: e.g.,
- Number of Pap tests,
- Number of immunizations,
- Number of well-child visits,
- Number of HIV tests,
- Prenatal care patients and deliveries, etc.
Each line typically has two columns – one for the number of patients who received that service or have that condition during the year, and one for the number of visits or services provided for that condition.
Table 6A essentially measures service volume for key clinical areas. It’s used to gauge how many patients with certain chronic conditions the health center is managing, and how intensively.
Table 6B – Quality of Care Measures
Table 6B contains the clinical quality measures that align with national quality indicators. Each measure usually has: a universe, a denominator, a numerator, and a calculated percentage.
- For example, the hypertension control measure counts hypertensive patients (the denominator) and those with blood pressure below 140/90 at the last reading (the numerator), yielding the percentage controlled.
- Other measures include:
- Diabetes control,
- Cervical cancer screening,
- Colorectal cancer screening,
- First-trimester entry into prenatal care,
- Childhood immunization status,
- Weight assessment and counseling for children and adolescents,
- Tobacco use screening and cessation intervention,
- Depression screening and follow-up, etc.
These measures reflect the quality of care and preventive health performance. Health centers are compared on these measures for quality awards and benchmarking. Table 6B is purely about the rate of meeting certain standards of care.
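The numerator/denominator structure described above reduces to a simple rate calculation. The sketch below uses made-up counts for a hypothetical hypertension control measure; it is not HRSA's calculation logic, only the arithmetic shape of it.

```python
# Sketch of a Table 6B-style measure rate, using hypothetical counts.
def measure_rate(numerator, denominator):
    """Percent of denominator patients meeting the measure; None if denominator is empty."""
    if denominator == 0:
        return None
    return round(100 * numerator / denominator, 1)

# e.g., suppose 812 of 1,240 hypertensive patients had BP < 140/90
# at the last reading (illustrative numbers):
print(measure_rate(812, 1240))  # 65.5
```

Guarding against an empty denominator matters in practice: a measure with no eligible patients should be reported as not applicable, not as 0%.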
Table 7 – Health Outcomes and Disparities
Table 7 traditionally focused on a small number of outcome measures. In recent UDS revisions, some outcome measures have been merged into Table 6B or restructured.
But essentially, Table 7 looks at actual health outcomes and often asks for stratification by race/ethnicity to see if there are disparities. It helps HRSA and health centers identify gaps in outcomes among different patient groups, supporting the goal of reducing health disparities.
Table 8A – Financial Costs
Table 8A details the expenses of the health center. It is organized by cost centers and by type of cost. For each service category, it lists the direct personnel costs, direct non-personnel costs, and indirect costs allocated. Summing across gives the total costs of running the health center for the year.
This table shows how resources are spent and the cost structure of the health center. It’s also used to calculate things like cost per patient or cost per visit.
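The cost-per-patient and cost-per-visit calculations follow directly from the Table 8A roll-up. The figures below are invented for illustration; the structure (personnel + non-personnel + allocated indirect costs, divided by totals) is what matters.

```python
# Illustrative Table 8A-style cost roll-up; all numbers are hypothetical.
direct_personnel = 6_200_000   # direct personnel costs
direct_other     = 1_450_000   # direct non-personnel costs
indirect         = 1_100_000   # allocated indirect costs

total_cost = direct_personnel + direct_other + indirect  # 8,750,000

total_patients = 12_500
total_visits   = 43_750

cost_per_patient = total_cost / total_patients   # 700.0
cost_per_visit   = total_cost / total_visits     # 200.0
print(cost_per_patient, cost_per_visit)
```

Tracking these two ratios year over year is a quick internal check: a sudden jump usually means either a cost misallocation or an undercounted patient/visit total.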
Table 9D – Patient Service Revenue
This table reports all monies related to patient services. For each payer category, it lists:
- Full Charges: the total billable charges for services provided to patients of that payer.
- Adjustments/Sliding Discounts: the amount that was written off. It also includes bad debt as a separate line for self-pay.
- Net Collections: the actual amount received from that payer source.
Essentially, 9D shows how much revenue the health center generated and collected from each type of payer, highlighting uncompensated care. For example, one can see the difference between what was charged vs. collected for Medicaid, or how much sliding fee discount was given to uninsured patients. It’s a critical table for understanding financial viability and payer mix.
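A basic internal consistency check on 9D-style data is that, for each payer, full charges should roughly equal adjustments plus net collections. The sketch below uses hypothetical payer figures and an assumed 2% tolerance for timing differences; it is an illustration of the check, not an HRSA edit rule.

```python
# Hedged sketch of a 9D consistency check with hypothetical figures:
# charges should approximately equal adjustments + net collections.
payers = {
    "Medicaid": {"charges": 3_000_000, "adjustments": 900_000, "collections": 2_100_000},
    "Self-pay": {"charges":   800_000, "adjustments": 550_000, "collections":   230_000},
}

def reconcile(rows, tolerance=0.02):
    """Return (payer, gap) pairs where the unexplained gap exceeds tolerance."""
    flags = []
    for payer, r in rows.items():
        gap = r["charges"] - (r["adjustments"] + r["collections"])
        if r["charges"] and abs(gap) / r["charges"] > tolerance:
            flags.append((payer, gap))
    return flags

print(reconcile(payers))  # [('Self-pay', 20000)] -- a 2.5% gap gets flagged
```

Running a check like this before data entry catches exactly the "collections + adjustments don't equal charges" error that reviewers commonly flag.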
Table 9E – Other Revenue
Table 9E captures non-patient service revenue, which includes all other sources of funding. Key lines include:
- Federal grants (the main 330 grant, plus any other federal grants like HIV, COVID funding, etc.),
- State and local grants,
- Foundation or private grants,
- Donation and fundraising revenue,
- Other income (which could be things like interest, rental income, etc.).
9E essentially completes the financial picture by listing how much money came in from grants and other sources outside of patient care activities.
Combined with 9D, it allows calculation of total revenue and comparison to expenses (8A). For example, a health center might show a federal grant of $1 million, state grants of $200k, etc., in 9E.
Step-by-Step Guide to Preparing and Submitting the UDS Report
Preparing and submitting a UDS report is a complex project that benefits from a structured approach. Below is a step-by-step guide to the UDS reporting process, from preliminary planning through final submission. Following these steps can help ensure your UDS report is complete, accurate, and on time:
Assemble Your UDS Team and Plan Early
Start by identifying a UDS lead coordinator and a cross-functional team well in advance of the reporting deadline. The UDS lead is often a quality director, data analyst, or compliance manager who will oversee the entire process.
Include representatives from clinical services, IT/EHR reporting, finance, billing, and HR. Begin planning in the fall (or earlier) for the year-end report. Schedule regular check-ins for the team.
Early planning ensures everyone knows their role in gathering data and avoids last-minute scrambling. Aim to have a timeline where an initial draft of the UDS data is ready for internal review at least five days before the February 15 due date.
Review UDS Instructions and Changes
Each year, HRSA releases a UDS Manual and a Program Assistance Letter (PAL) outlining any changes in reporting requirements. Have the team thoroughly read the UDS Manual and PAL early on. Note new or changed definitions and any added or removed measures.
If there were changes, adjust your data collection approach accordingly. Understanding the instructions is critical – for example, how HRSA defines a “visit” or a “patient” for UDS may be specific.
Clarify any confusing points by reaching out to the UDS Support Center or asking during training webinars. Essentially, equip your team with knowledge of what exactly needs to be reported and how.
Collect and Prepare Data Throughout the Year
Don’t wait until December to gather data. Treat UDS as a year-round process. Throughout the calendar year:
- Ensure front-desk staff are updating patient demographics (race, ethnicity, income, insurance) regularly, since this feeds Tables 3B and 4.
- Providers and clinical staff should document visits and services consistently in the EHR (use correct service codes, enter all diagnoses, etc.).
- If possible, run UDS-related reports quarterly or monthly to monitor where you stand. For instance, check your quality measures from Table 6B monthly – this not only helps performance but ensures you know early if data looks off.
- Keep an eye on enabling services and other activities that might not be in the EHR (maybe maintain a simple log for outreach or enabling encounters).
- Finance can track sliding fee usage and other needed financial stats during the year.
By continuously collecting data and addressing gaps on the fly, you’ll have a much cleaner dataset at year’s end.
Extract Data from Systems After Year-End
Once the reporting year (Dec 31) closes, begin pulling together all the data. This usually happens in the first couple of weeks of January:
- IT/EHR reports: Generate all relevant reports from the EHR (demographics, visit counts, clinical measures, diagnosis counts, etc.). Many EHRs have a “UDS summary” report – run that, but also cross-verify key figures with custom queries if needed. Export lists of patients for certain measures if you need to validate numerator/denominator calculations.
- Billing/Financial reports: Finance should produce the final numbers for Tables 9D and 9E (charges, payments, etc.), and the trial balance for expenses to fill Table 8A. If the fiscal year matches the calendar year, use the annual financials; if not, use the calendar year slice of data. Make sure to separate out any non-health-center project finances if your org does more than the scope of the project.
- HR/Staffing info: HR provides the roster of staff and FTE status. Calculate FTEs for each category for Table 5. (Often, an Excel sheet is used where each staff member is listed with their role and FTE, then summed by category.)
- Other data: Gather any needed manual logs (like number of patients on MAT, number of outreach events or assistances for insurance enrollment, etc., for Appendix E).
This step is essentially data aggregation from all sources into one place (often the UDS Excel template provided by HRSA or your own workbook). Many health centers use an internal spreadsheet resembling the UDS tables to start plugging in the numbers.
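The FTE roll-up mentioned above (a staff roster summed by category) is simple enough to automate. The roster entries and role names below are hypothetical; a real implementation would map each HR job title to its official UDS staffing category first.

```python
# Minimal sketch of the Table 5 FTE roll-up: each staff member listed
# with a role and FTE, then summed by category. Roster is illustrative.
from collections import defaultdict

staff = [
    ("Dr. A", "Physician", 1.0),
    ("Dr. B", "Physician", 0.5),
    ("NP C",  "Nurse Practitioner", 1.0),
    ("RN D",  "Nurse", 0.8),
]

def fte_by_category(roster):
    """Sum FTEs per staffing category."""
    totals = defaultdict(float)
    for _name, role, fte in roster:
        totals[role] += fte
    return dict(totals)

print(fte_by_category(staff))
# {'Physician': 1.5, 'Nurse Practitioner': 1.0, 'Nurse': 0.8}
```

Automating this roll-up also makes it easy to spot staff who were left out or mapped to the wrong category before the numbers reach Table 5.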
Validate and Cross-Check the Data Internally
Before entering anything into HRSA’s system, perform a thorough internal review of the data for accuracy and consistency. Some key validation steps:
- Cross-table checks: Verify that totals match across tables – for example, total unduplicated patients should be consistent across Tables 3A, 3B, 4, and 5. Resolve any discrepancies now.
- Compare to the prior year’s UDS: Generate a comparison to last year’s submission for all major stats. Look for any large changes. Significant increases or drops should be explainable. For example, if patients increased by 25%, is that real growth, or did you change how you count? Investigate large variances for accuracy. HRSA expects year-to-year changes to be relatively modest unless there’s a clear reason. If a big change is legitimate, prepare a brief explanation for it.
- Check for completeness: Ensure no table is missing data. Every table should be marked as “complete”. If some cells have zero, and that makes sense, that’s fine, just ensure it’s not an accidental omission.
- Quality measures sanity check: Look at Table 6B percentages – do they make sense given what you know? If one measure is drastically different from last year or from expectations, double-check the denominator and numerator logic. Perhaps a data mapping issue in EHR could cause a numerator to drop out. It’s better to catch and fix that before submission than to explain it later.
- Patient lists spot-check: For critical measures or counts, some organizations do a quick chart audit or sample review. For example, if 6B is showing only 50% of diabetics had controlled A1c, maybe review a sample of diabetic patient charts to ensure the labs were properly pulled. Or if enabling services visits seem low, verify if some encounters weren’t recorded.
This internal QC step is crucial to avoid sending erroneous data. As one checklist item, HRSA suggests reviewing the prior year reviewer’s comments letter; often, the reviewer’s letter from last year will say what you needed to correct or watch out for, so you don’t repeat mistakes. Learn from those comments to ensure those issues are addressed in the current report.
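The year-to-year variance review described above can be partially automated. The sketch below flags any metric that moved more than a chosen threshold; the 25% threshold and the metric names are assumptions for illustration, not an HRSA rule.

```python
# Hedged sketch of the prior-year comparison: flag metrics that moved
# more than an assumed threshold (25% here; pick your own).
def flag_variances(current, prior, threshold=0.25):
    """Return {metric: fractional_change} for metrics exceeding the threshold."""
    flags = {}
    for metric, cur in current.items():
        prev = prior.get(metric)
        if prev:
            change = (cur - prev) / prev
            if abs(change) > threshold:
                flags[metric] = round(change, 3)
    return flags

prior   = {"total_patients": 10_000, "dental_visits": 6_000}
current = {"total_patients": 10_400, "dental_visits": 3_000}

print(flag_variances(current, prior))  # {'dental_visits': -0.5}
```

Each flagged metric then needs either a data correction or a written explanation (e.g., a dental clinic closure), ready to paste into the EHB comment field.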
Begin Data Entry into the EHB (or Upload Data)
With vetted numbers in hand, proceed to enter the data into HRSA’s Electronic Handbooks UDS reporting module. This can begin as early as January 1 when the system opens. Many fields will be manual entry by table cells. Some EHB features to use:
- The offline UDS Excel import: HRSA often provides an offline Excel file that can import some data. If you have populated that, you might be able to upload it to pre-fill tables. If not using that, just type in the numbers carefully.
- Save often, and mark tables as complete as you finish each.
- As you fill in data, the system will run validation checks. Pay attention to any real-time edit flags. Some edits are “must-correct,” and some are “warning/confirm” type. For example, if your total patient count in 3A doesn’t match 3B, the system will flag an error. If your diabetic control percentage is extremely low compared to the national average, it might warn and ask for confirmation or explanation.
- Enter any necessary comments/explanations for data that trigger warnings. For instance, if you had an unusual year, you might note “COVID-19 impacts: clinic reduced operations, hence lower visit counts” in a comment field for context.
- Ensure that the person responsible for each section enters or reviews it.
Perform a Final Review in EHB and Address System Edits
Once all data is entered, use the EHB tools to validate the report. The EHB has a summary that shows which tables are complete and any outstanding edit checks. Go through each flagged edit:
- If it’s a discrepancy edit, fix the numbers or document why they are correct.
- If it’s an outlier edit, provide a clear explanation in the comment. For example, “Dental visits decreased 50% because our dental clinic was closed for renovations for 6 months in the year.” Avoid non-informative explanations; be specific about the impact and reason for variance.
- Ensure every edit either has a correction or a meaningful comment so that when the HRSA reviewer sees it, they understand the data is correct and why it might be unusual.
- Also, verify that all tables have the green check mark in EHB before submission.
Submit the UDS Report by February 15
When you are confident the report is accurate and all validations are addressed, it’s time for the final submission. Typically, a C-suite executive or project director must officially submit and certify the UDS report. This is effectively signing off that, to the best of your knowledge, the report is complete and accurate. Go into EHB, click “Mark UDS Report as Complete and Accurate” and then “Submit”.
Make sure this is done on or before Feb 15. After submission, download a PDF of the report or confirmation for your records.
Work with the UDS Reviewer During Review Period
After submission, be prepared to respond to HRSA UDS reviewer inquiries between mid-February and March. Check your email for any messages from an HRSA or BPHC contractor address; they often reach out via email with questions. Also, the EHB notification system may post comments on your submission.
Reviewers might ask for clarification on certain data or request that you double-check something. Respond promptly and cooperatively. If a correction is needed, the reviewer can reopen the report for you. Make the edit and resubmit the section as instructed.
Keep communications professional and clear – remember, the reviewer’s goal is to ensure the data is high quality. Provide any additional explanation or documentation they require within the timeframe given. Quick turnaround will help finalize things faster. By the end of March, all issues should be resolved.
Finalize and Lock the Report
By March 31, confirm that all corrections have been made and the report is once again in Complete status. The EHB system will lock it down after that date. HRSA will treat this as your final official data. At this stage, you might want to brief leadership or the board on the final numbers and any noteworthy changes, since the data often reflects organizational performance metrics.
Post-Submission Analysis and Preparation for Next Cycle
Although technically beyond submission, a best practice is to analyze your UDS report for insights and archiving:
- Conduct a “post-mortem” with your team: what went well, what was challenging in the data collection, and any improvements for next year’s process.
- Use the UDS data for internal strategic planning. For example, if UDS shows a low cancer screening rate, that becomes a quality improvement focus for the coming year. If enabling services seems under-documented, maybe implement better tracking.
- If you anticipate changes (like new sites or services) in the current year, start adjusting data systems now to capture those for the next UDS.
- Finally, store the final UDS submission, backup data, and any analysis in a secure location. This will be useful next year for comparisons and also in case of any audits or site visits by HRSA.
UDS Reporting Checklist (Audit-Ready)
To further assist in preparing an error-free UDS submission, here is a UDS Reporting Checklist of key items and best practices. This checklist is also geared toward making sure your UDS data would stand up to scrutiny in an audit or HRSA review. Use this as a final run-through before submitting:
| Checklist Area | Action / Validation Item | Key Notes & Validation Guidance |
| --- | --- | --- |
| Planning & Governance | Start Early & Organize Team | Confirm a designated UDS preparation team is in place. Ensure the workflow and timeline begin well before year-end, not as a last-minute activity. |
| Timeline Management | Internal Deadline Before HRSA Deadline | Complete a full draft UDS report at least 5 days before Feb 15 to allow time for review, corrections, and approvals. |
| Historical Review | Review Last Year’s Reviewer Letter | Retrieve last year’s UDS reviewer comments. Identify recurring issues and confirm they are explicitly addressed in the current report. |
| Baseline Reference | Pull Final Prior Year UDS Report | Download the finalized prior-year UDS submission (including corrections) to support year-to-year comparisons. |
| Data Validation | Year-to-Year Comparison – Patient Counts | Compare Tables 3A, 3B, and the Patients by ZIP Code table for total patients and distribution changes. Flag unexplained variances. |
| | Year-to-Year Comparison – Demographics & Insurance | Review Table 4 for shifts in payer mix, uninsured rate, or demographic composition. |
| | Year-to-Year Comparison – Visits & Staffing | Analyze Tables 5 and 6A for changes in visit volumes, staffing levels, or service mix. |
| | Year-to-Year Comparison – Clinical Measures | Review Tables 6B and 7 for denominator changes and performance percentage shifts. |
| | Year-to-Year Comparison – Financial Ratios | Validate Tables 8A, 9D, and 9E for cost-to-revenue alignment and major financial variances. |
| Variance Management | Explanations for Significant Changes | Provide clear, specific explanations for large spikes or drops (e.g., staffing vacancies, QI initiatives). Avoid vague or placeholder comments. |
| Cross-Table Integrity | Consistency Across Tables | Verify total unduplicated patients match across 3A, 3B, 4, and 5. Ensure clinical denominators logically align (e.g., 6A ≥ 6B). |
| Data Reconciliation | Reconcile Sub-Totals to Patient Totals | Confirm race, income, age, and other sub-totals reconcile to the total patient counts. Resolve discrepancies at the source. |
| System Validation | Resolve Edit Flags | Clear or justify all EHB edit checks. No unresolved flags should remain before submission. |
| Form Completion | Table Completion Status | Confirm all required tables and forms show as “Complete” in EHB. Verify that the Health IT and Other Data Elements forms are fully populated. |
| Executive Oversight | Leadership Review & Approval | Ensure the CEO, CFO, or authorized executive reviews key metrics and is comfortable attesting to data accuracy. |
| Submission Control | Official Submission Confirmation | Confirm EHB status shows “Submitted”, not just “Saved.” Retain submission confirmation or PDF copy. |
| Post-Submission Management | Reviewer Follow-up | Monitor EHB and email during the review period. Respond promptly to clarification requests and document any updates. |
| Final Certification | Final Attestation & Lock | Re-attest and re-submit after corrections. Ensure the report is officially marked final and locked. |
| Audit Readiness | Documentation for Audit | Maintain an audit binder or digital folder with the final report, source data, prior year report, and variance explanations. |
Common UDS Reporting Challenges
UDS reporting can be challenging even for experienced health centers.
Related: Top Compliance Mistakes in UDS Reporting (and How to Avoid Them)
There are several common pitfalls and difficulties that organizations encounter during the process. Being aware of these challenges can help you proactively avoid them or address them before they impact your submission:
Data Inconsistencies and Alignment Issues
One frequent issue is when numbers don’t align across tables – for example, the total patient count on the demographic tables does not match the clinical tables.
This often happens due to data siloing or different data sources being out of sync. It’s common to discover that different reports yield slightly different patient totals. This inconsistency can trigger errors and require time-consuming reconciliation. The challenge lies in ensuring all systems use the same definition of a “UDS patient” and that you have processes to unduplicate patients across systems.
Patient Count Pitfalls (Meeting Targets)
Many FQHCs have patient targets in their grant requirements. A challenge can occur if your UDS patient count doesn’t align with those targets.
- For instance, if an FQHC is supposed to serve 10,000 patients but UDS shows only 8,000, it raises concerns.
- Sometimes the issue is under-reporting rather than actual performance.
- For example, if the health center failed to count certain outreach or telehealth encounters that should count as visits, they might be undercounting unique patients.
- Misidentifying what constitutes a patient or a visit can lead to targets being missed on paper, even if services were provided.
- This is both a compliance and a programmatic challenge.
Related: How FQHCs Can Modernize Tech to Meet UDS+ and HRSA Compliance
Misclassified or Incomplete Staffing Data
UDS Table 5 requires categorizing every staff FTE properly, which can be tricky. A common challenge is misclassifying staff roles or FTEs.
- For instance, if a nurse practitioner is accidentally listed under “Physicians” or a case manager is omitted from enabling services count, the staffing mix appears wrong.
- Additionally, sometimes FTE data is incomplete if there are part-time or contracted staff who are missed.
- An incorrect FTE count can then throw off productivity metrics, making them look abnormally high or low.
- This often triggers reviewer questions, especially if costs don’t align with FTEs – e.g., high salaries reported but low FTE or vice versa.
- Ensuring HR data is up-to-date and matches UDS categories is a challenge every year.
Clinical Measure Data Gaps
UDS clinical quality measures rely on complete and accurate clinical documentation.
Common problems include missing data for key elements. If EHR workflows were not strictly followed, those measures can underreport performance.
Another challenge is if the clinical definitions change year to year – keeping up with those changes to properly query EHR data can trip up teams. Some centers find that their EHR reports don’t fully match UDS specs, requiring manual adjustments. All these can result in clinical measures appearing worse or odd, prompting questions or affecting quality awards.
Undercounting Enabling and Support Services
Enabling services and other support services are crucial to FQHCs but often under-documented. A challenge is that these services might not be scheduled or billed in the EHR, so they can be easily left out of UDS counts. For example, if health educators see patients but those interactions aren’t logged as encounters, UDS Table 5 might show zero enabling services visits when in reality many took place.
Similarly, behavioral health consultations or phone follow-ups might get missed if not captured as encounters. This leads to undercounting the scope of services, which not only diminishes the reported impact but also might raise red flags if, say, you have enabling services staff on payroll but report no enabling visits. The challenge is setting up systems to record all billable and non-billable touches.
Revenue Reporting Complexities
Financial data can be challenging to align with UDS categories. A typical pitfall is missing some revenue in Table 9D or 9E, for instance, not separating pharmacy income, or not counting lab income properly, or misclassifying a grant. In some cases, health centers forget to include things like COVID-specific funding or report it incorrectly if guidance was updated.
Another challenge is accrual vs. cash basis: UDS has historically been on a modified accrual basis, and if a health center’s bookkeeping is cash-based, reconciling that for UDS can be confusing.
Mistakes in 9D/9E can cause significant issues; a common one is when collections plus adjustments don’t equal charges, indicating an error. Another is forgetting to report the value of donated services/supplies if required. These complexities make the finance section prone to error, especially if finance staff are not intimately familiar with UDS definitions.
Short Turnaround Time / Resource Strain
Practically, one of the biggest challenges is simply the short window to compile and submit everything, which can strain staff. Many health centers have lean teams, and pulling people from daily duties to focus on reporting can be tough.
If a key staff member is on leave in January or if there’s turnover, knowledge gaps appear. This time pressure sometimes causes errors due to rushing. And if the UDS reviewer then requires significant changes in March, it can collide with other work. So the timeline itself is a challenge every year.
Understanding and Keeping Up with Changing Requirements
UDS is not static; each year, HRSA might implement new elements, retire others, or refine definitions. Health centers often struggle to keep up with these changes.
- For example, the introduction of Sexual Orientation and Gender Identity (SOGI) data collection was confusing.
- In 2025, SOGI reporting was optional, and it was slated to be eliminated in 2026.
- Centers might either over-report or under-report such data due to uncertainty.
- Similarly, the announcement of future big changes can create confusion about what to do now.
- It’s challenging to ensure all staff are aware of and trained on the new definitions in time for data collection.
- Without deliberate effort, centers might find out at year-end that they misunderstood a requirement, forcing retrospective corrections or accepting a gap.
Data Integration for Multi-Site or Migrated Data
Health centers that have multiple sites or that migrated to a new EHR during the year face a special challenge: data consolidation. If an FQHC merged with another, or took on a new site that previously had its own systems, combining those datasets for one UDS is a heavy lift.
Similarly, if you switched EHR in mid-year, half your data might be in one system and half in another – making it tough to generate one consistent report. This can lead to incomplete data if not carefully merged.
Related: UDS Reporting for Multi-Site Health Centers: Data Consolidation Challenges
Human Error and Staff Turnover
Finally, an overarching challenge is the risk of human error. UDS reporting involves manual steps and judgment calls. Typos in data entry can skew numbers.
If the person who knew how a particular data point was derived leaves the organization, the new staff might not know where to get it. Institutional knowledge about UDS needs to be maintained, which can be hard with turnover. Training new staff on UDS intricacies is an often-cited challenge among health centers.
UDS Reporting Risks and Compliance Exposure
UDS reporting is not just a data exercise; it carries significant compliance implications and risks for health centers. Failure to report accurately and on time can expose a health center to various negative consequences. Here we outline the key risks and compliance issues tied to UDS reporting:
Non-Compliance with Federal Requirements
Submitting the UDS report is a condition of the federal grant for Health Center Program awardees.
- As per HRSA requirements, health centers must have systems to compile required data and must submit timely, accurate, and complete UDS reports.
- Not meeting these obligations is considered non-compliance.
- In practical terms, if a health center fails to submit by Feb 15 or submits obviously incomplete data, HRSA can issue a Condition on the grant award.
- This could lead to a requirement for a corrective action plan, closer monitoring, and, in extreme cases, suspension of funding.
- The compliance manual explicitly lists timely and accurate UDS submission as an expectation, so health centers treat it very seriously.
Risk of Funding Impact
UDS data directly influences funding in several ways. First, as noted, it’s the basis for HRSA justifying continued funding to Congress. If UDS data suggest a health center is underperforming, it might jeopardize future grant renewal or trigger technical assistance interventions.
Moreover, HRSA uses UDS data to set patient targets and other programmatic expectations; missing those targets repeatedly could result in funding consequences. On a more immediate level, UDS data is used to calculate Quality Improvement Awards. These are additional grant dollars given to high performers. If your data is inaccurate, you might lose out on the award money you deserve.
Conversely, if data is inaccurately inflated and you receive an award, an audit could force payback. Financially, UDS can also influence Medicaid PPS rates or other reimbursements because it establishes volume and scope; incorrect data might undervalue your operations and lock in lower rates.
Reputational Risk
Submitting erroneous data can skew the portrayal of the health center’s impact, which has reputational implications. For example, if, due to an internal error, you report only half of your actual patient count, you appear to be under-serving your community.
This could lead local stakeholders or boards to question performance. HRSA and the public see UDS results, so mistakes are visible. If quality measures are reported as low due to data issues, it might wrongly tarnish the health center’s quality reputation among peers.
There’s also a risk of public scrutiny: UDS data is public, and media or researchers sometimes analyze it. One wouldn’t want an inaccurate outlier in data drawing negative attention. Thus, protecting the accuracy of UDS protects the health center’s public image of effectiveness.
Financial Audit and Internal Control Risks
The financial figures in UDS are often compared to audited financial statements. If there’s a large discrepancy, it raises questions about the health center’s financial reporting controls. While UDS itself is not a financial audit, inconsistent reporting might prompt auditors or HRSA to dig deeper.
- For instance, if UDS shows a break-even or surplus but audited statements show a deficit, something is off.
- Misreporting grant revenues or expenses can have compliance ramifications, since federal funds must be reported accurately.
- In extreme cases, intentional misreporting could be construed as fraud, though most UDS errors are not intentional; a pattern of inaccuracies might draw unwanted scrutiny from oversight bodies.
Site Visit and Conditions Risk
HRSA conducts Operational Site Visits for health centers every few years. During an OSV, one element reviewed is the health center’s compliance with data reporting requirements. Site visit teams often look at UDS data and may ask the health center to explain certain figures or trends.
If the site visit team finds that the health center did not have proper systems for UDS or the data seems unreliable, they can issue a finding. That might result in a condition that the health center must develop better data systems. They might even spot-check a few data points.
Being unprepared to justify your UDS data under examination is a compliance risk. The health center must be able to demonstrate how UDS data is collected and validated during such reviews.
Loss of Opportunities
Beyond compliance, inaccurate UDS data can cause a health center to miss opportunities or face penalties in programs linked to performance.
- For example, some state funding allocations or Medicaid incentive programs might use UDS metrics.
- If you under-report on those metrics, you might lose incentive payments.
- Additionally, things like the national 340B drug pricing program eligibility are tied to being an FQHC; while UDS data itself doesn’t affect 340B, if one lost FQHC status due to non-compliance, that would be catastrophic.
- So there’s a cascade – UDS non-compliance could trigger a chain reaction affecting many benefits the health center relies on.
HRSA Watchlist or Intervention
Consistently poor or inconsistent UDS reporting can put a health center on HRSA’s radar for potential intervention or technical assistance.
For instance, if year after year the center has to reopen UDS multiple times or has major errors caught by reviewers, HRSA may mandate assistance from a Primary Care Association or other consultant to improve data capabilities. While help is good, being in that situation indicates the health center has a compliance weakness. It’s preferable to demonstrate self-sufficiency in reporting accurately.
Operational and Clinical Risks
UDS data, when accurate, is a tool for the health center’s own management. If the data is wrong, management might make misguided decisions.
For example, if you under-report visits, you might not realize a service line is actually quite busy and needs expansion. If you over-report a quality metric mistakenly, you might not invest in needed improvement, thinking you’re doing fine. Thus, inaccurate data is a risk to actual performance improvement efforts and resource allocation internally. Compliance exposure aside, it can lead to misinforming leadership or the board, which is a governance risk.
Best Practices for Reducing UDS Reporting Risk
To mitigate the challenges and risks discussed, health centers have developed numerous best practices for UDS reporting. Implementing these can greatly enhance the accuracy of your data and the efficiency of your reporting process, while keeping you in compliance. Here are some proven best practices:
Treat UDS as a Year-Round Activity
Don’t silo UDS as a once-a-year scramble. The most successful health centers integrate UDS reporting into their ongoing operations. This means monitoring key UDS metrics monthly or quarterly rather than discovering surprises in January. By embedding UDS data checks into regular QI meetings or financial reviews, you normalize the data collection.
For example, if each quarter you pull a UDS-like report and check progress toward targets, the end-of-year compilation will be straightforward, and you can catch anomalies early. Year-round attention also spreads the workload and reduces stress on staff.
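That quarterly pull can be as simple as a small script. A minimal sketch, assuming hypothetical visit records where a `countable` flag stands in for your own UDS visit logic:

```python
from collections import defaultdict
from datetime import date

# Hypothetical visit records: (visit_date, patient_id, countable) tuples.
# "countable" stands in for whatever logic flags a visit as UDS-reportable.
visits = [
    (date(2024, 1, 15), "P001", True),
    (date(2024, 2, 3),  "P002", True),
    (date(2024, 4, 20), "P001", True),
    (date(2024, 5, 2),  "P003", False),  # e.g., an encounter that doesn't count
    (date(2024, 7, 9),  "P004", True),
]

def quarterly_counts(visits):
    """Roll up countable visits and unduplicated patients by quarter."""
    visit_counts = defaultdict(int)
    patients = defaultdict(set)
    for visit_date, patient_id, countable in visits:
        if not countable:
            continue
        quarter = f"{visit_date.year}-Q{(visit_date.month - 1) // 3 + 1}"
        visit_counts[quarter] += 1
        patients[quarter].add(patient_id)
    return {q: {"visits": visit_counts[q], "patients": len(patients[q])}
            for q in sorted(visit_counts)}

print(quarterly_counts(visits))
# → {'2024-Q1': {'visits': 2, 'patients': 2}, '2024-Q2': {'visits': 1, 'patients': 1},
#    '2024-Q3': {'visits': 1, 'patients': 1}}
```

Running a roll-up like this each quarter means the year-end total is never a surprise; you have already seen three-quarters of it.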
Establish Clear Ownership and Coordination
Assign a single UDS Coordinator who has the authority and oversight across departments. This person ensures accountability; they follow up with each data owner and keep the process on track.
Importantly, give this coordinator cross-department visibility or even authority; for instance, they should be empowered to request data or corrections from any department. A coordinator with support from leadership can cut through potential silos and ensure a unified report. Organizations that assign UDS duties by committee without clear leadership often struggle; a single point person works best.
Invest in Training and Knowledge Retention
UDS definitions and expectations should not be arcane knowledge held by one guru. Spread the knowledge through training sessions for relevant staff.
- For instance, train front desk and data entry staff on the importance of capturing race/ethnicity and income data properly.
- Train providers and nurses on what counts as a “UDS visit” and the documentation needed. Provide an annual refresh for staff on any changes to UDS measures.
- Also, make use of HRSA’s UDS training resources: webinars, TA calls, and UDS manuals.
- Identify UDS training needs for new staff or new managers ahead of time. Document your internal procedures for UDS prep, so if key staff leave, the knowledge isn’t lost.
Use Tools and Technology to Your Advantage
Leverage the UDS reporting tools available to you. If your EHR has a UDS report module, use it and collaborate with your vendor or HCCN to ensure it’s updated to current specs. Consider using a data analytics platform if you have multiple data sources; some networks provide tools that automatically produce UDS tables by aggregating EHR, pharmacy, and billing data. Use the HRSA Preliminary Reporting Environment each fall to test-run your data in the system and catch validation issues early.
Utilize the UDS test tools or Excel templates provided; for example, HRSA’s official Excel can do some auto-calculations and highlight discrepancies. Some health centers even use custom scripts or software to cross-check UDS. Embracing these tools reduces manual errors. Additionally, create internal dashboards for key UDS metrics so that leadership can track performance.
Perform Data Audits and Validations Before Submission
We can’t overstate the importance of internal QA. Develop an internal “mini-audit” process of your UDS data prior to submission. For example, randomly sample a few patient records and trace them through the UDS tables: pick one patient and see if their demographics, diagnoses, and visits are correctly represented in each relevant table.
Or audit one measure by manually checking a list of prenatal patients and verifying that it matches what you reported. Some centers do a peer review: have someone not directly compiling the data review the report for glaring issues or ask challenging questions. This simulates the HRSA reviewer and prepares you to answer or correct things. Essentially, try to catch errors before HRSA does.
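An internal spot-check like this can be partially scripted. A minimal sketch, assuming a hypothetical patient-level extract and a reported demographic line; all field names are illustrative:

```python
import random

# Hypothetical patient-level extract and the aggregate counts you reported.
patient_records = [
    {"id": "P001", "race": "Asian", "visits": 3},
    {"id": "P002", "race": "White", "visits": 1},
    {"id": "P003", "race": "Asian", "visits": 2},
]
reported_totals = {"Asian": 2, "White": 1}  # e.g., one demographic table line

def recount(records):
    """Independently re-derive the aggregate counts from the source extract."""
    totals = {}
    for rec in records:
        totals[rec["race"]] = totals.get(rec["race"], 0) + 1
    return totals

def spot_check(records, reported, sample_size=2, seed=42):
    """Compare recounted totals to reported ones, and pick patients to trace."""
    discrepancies = {k: (reported.get(k), v)
                     for k, v in recount(records).items()
                     if reported.get(k) != v}
    sample = random.Random(seed).sample(records, sample_size)
    return discrepancies, [rec["id"] for rec in sample]

discrepancies, sampled = spot_check(patient_records, reported_totals)
print("Discrepancies:", discrepancies)   # empty dict means the counts reconcile
print("Trace these patients manually:", sampled)
```

The sampled IDs are then traced by hand through the chart and each relevant UDS table, which is the part no script can replace.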
Maintain Documentation and Data Trail
Keep a well-organized audit file of how each number was derived. This includes listing queries used, saving copies of source reports, and notes on any adjustments made. Having this documentation serves two purposes:
- It helps if you need to explain or adjust something during HRSA review, and
- It is invaluable for new staff next year to understand how data was gathered.
It’s also an internal control, demonstrating that figures weren’t just made up but have backup. If HRSA or auditors ever ask, “How did you get this number?”, you can readily show them. Internally, treat this like preparing tax work papers for an IRS return.
Engage Leadership and Board in UDS Data
UDS shouldn’t live only at the staff level. Regularly report UDS metrics to leadership and the board, not just after submission but throughout the year. When the C-suite and board are familiar with the UDS key performance indicators, they become allies in ensuring data is good and used for improvement.
For example, present a UDS scorecard to the board quarterly – “here’s our patient count progress, our quality measure trends, our financial ratio from a UDS perspective.”
This not only keeps these metrics in focus but also means that at year-end, there are no surprises for leadership. It also underscores to the whole organization that UDS data matters and is being watched at the highest levels, which can motivate diligence in data entry and quality.
Proactive Communication with HRSA or PCA
If you foresee an issue, don’t wait for HRSA to find it. For instance, if a disaster or EHR failure caused data loss, inform your Project Officer or the UDS Helpdesk proactively and seek guidance.
They may grant an extension or advise how to annotate the report. Utilize your state/regional Primary Care Association or Health Center Controlled Network resources, which often run UDS trainings, user groups, and can review your data ahead of time for reasonableness. Peers in those networks might share solutions to common problems. Essentially, don’t hesitate to ask for help; it’s better to clarify a definition or method in advance than to guess and be wrong.
Implement Lessons Learned Each Year
After each UDS cycle, hold a debrief meeting with your team. Document what went well and what issues arose. Maybe you discovered that enabling service visits were undercounted; plan a fix for the new year.
Maybe the EHR report was inaccurate for one measure; work with IT to correct that for next time. Over the years, continually refine your data collection processes.
- For example, one health center might realize, “We always have trouble with income data – let’s implement an annual patient survey to update income info and train staff to get that.”
- Another might realize “We need to tag telehealth visits differently.”
- Make those changes early in the new year so that by next UDS, they are resolved. Continuous improvement in data processes is a hallmark of reducing UDS headaches.
Use UDS Data for Strategic Improvement
Finally, one of the best practices that indirectly reduces risk is to actually use your UDS data beyond reporting. When staff see that the data is being used to drive decisions, they appreciate its value and are more careful in data handling.
Using UDS data as a strategic tool closes the loop; it’s not just data for HRSA, it’s data for your own organization. This mindset leads to better data quality because people want an accurate reflection of their work to inform improvements. It also means that if something looks off, clinicians or managers are likely to notice and correct it even before the reporting period.
UDS Reporting Tools and Technology Considerations
Leveraging the right tools and technology can make a significant difference in the ease and accuracy of UDS reporting.
Given the data-intensive nature of UDS, health centers should consider various tech solutions and system setups to streamline the process. Here are key tools and considerations:
HRSA Electronic Handbooks and UDS Module
At a basic level, all health centers will use the HRSA EHB online portal to submit UDS. Ensure that your EHB access is up-to-date, and that the staff responsible have the correct roles. The EHB UDS module comes with built-in features:
- Automated calculations for some totals,
- Real-time edit checks, and
- A comparison tool.
Train your team on using these features – for instance, after data entry, run the EHB comparison tool to see year-over-year differences easily, which helps catch anomalies.
Also, the EHB allows multiple users to work on different tables simultaneously, so coordinate to use that capability. Understanding the EHB platform’s capabilities is fundamental.
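The year-over-year comparison is also easy to replicate internally before you ever log into the EHB. A minimal sketch, with illustrative metric names and a configurable threshold for what counts as an anomaly:

```python
def flag_yoy_anomalies(current, prior, threshold=0.20):
    """Flag metrics whose year-over-year change exceeds a threshold.

    A rough stand-in for the kind of comparison the EHB tool surfaces;
    the metric names and 20% default are illustrative, not HRSA rules.
    """
    flags = {}
    for metric, value in current.items():
        baseline = prior.get(metric)
        if not baseline:
            continue  # skip metrics with no prior-year value
        change = (value - baseline) / baseline
        if abs(change) > threshold:
            flags[metric] = round(change, 3)
    return flags

current = {"total_patients": 12500, "medical_visits": 31000, "dental_visits": 2100}
prior   = {"total_patients": 12000, "medical_visits": 30500, "dental_visits": 3400}
print(flag_yoy_anomalies(current, prior))
# → {'dental_visits': -0.382}
```

A flag is not necessarily an error; a 38% drop in dental visits might reflect a provider vacancy, but it is exactly the kind of figure a reviewer will ask about, so explain it before they do.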
UDS Manual and TA Resources
While not “technology” in the digital sense, the UDS Manual PDF is an essential tool – treat it as your primary reference and have it handy.
Additionally, HRSA’s UDS Technical Assistance webpage and FAQs should be considered part of your toolkit. They often post Q&A clarifications, which can guide how you use your systems to capture data. For example, if an FAQ clarifies that telehealth visits count in certain lines, you may adjust your EHR setup accordingly.
EHR and Practice Management Systems
Your Electronic Health Record and practice management software are arguably the most important tools. Specifically, check if your EHR provides a UDS reporting package or template.
Many vendors release annual UDS updates to their reporting modules to align with the latest specs. Use these to generate preliminary numbers. Be aware of how your EHR defines certain measures versus HRSA; sometimes a little configuration or mapping is needed.
Engage your EHR vendor’s support or user community if you encounter a discrepancy. Some EHRs might allow extraction of a UDS-specific data file. At minimum, ensure you have reporting proficiency with your EHR: know how to pull lists of patients with certain conditions, how to count visits by provider, etc. If your EHR has a built-in UDS dashboard, consider monitoring it throughout the year.
Related: How to Automate UDS Reporting with EHR & Analytics Tools
Data Analytics and Reporting Tools
Many FQHCs invest in third-party data analytics solutions or are part of a Health Center Controlled Network that provides one. Tools like Azara DRVS, i2iTracks, Tableau dashboards, Power BI, or custom SQL databases can dramatically ease UDS prep.
- For example, Azara DRVS has a UDS module that can produce each table and even list the patients behind each number for validation.
- These tools often consolidate data from multiple sources into one warehouse and apply UDS logic uniformly.
- The advantage is more robust data validation and the ability to drill down on any figure that looks off.
- If your health center uses such a platform, integrate it fully into your UDS workflow, e.g., run the preliminary UDS report from the platform and verify it against EHB or vice versa.
- The investment in integration upfront pays off with automated UDS outputs.
- Smaller centers without such tools might rely on Excel heavily; while Excel can work for summing and comparing, be cautious with manual Excel operations, as they can introduce errors if formulas go wrong.
- Still, some use Excel as a tool, e.g., input raw data extracts into Excel pivot tables that mirror UDS tables.
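The pivot-table approach can also be mirrored in a few lines of code, which avoids fragile spreadsheet formulas. A minimal sketch, with illustrative age bands loosely patterned on UDS demographic groupings (not the official table lines):

```python
# Hypothetical patient extract; ages and sexes are illustrative only.
patients = [
    {"age": 4, "sex": "Female"}, {"age": 4, "sex": "Male"},
    {"age": 35, "sex": "Female"}, {"age": 67, "sex": "Male"},
]

# Illustrative bands; the real UDS demographic table uses finer age lines.
AGE_BANDS = [(0, 17, "Under 18"), (18, 64, "18-64"), (65, 200, "65 and over")]

def age_band(age):
    for low, high, label in AGE_BANDS:
        if low <= age <= high:
            return label
    return "Unknown"

def crosstab(patients):
    """Pivot patients into an age-band x sex grid, mirroring a UDS-style table."""
    table = {}
    for p in patients:
        row = table.setdefault(age_band(p["age"]), {"Female": 0, "Male": 0})
        row[p["sex"]] += 1
    return table

print(crosstab(patients))
# → {'Under 18': {'Female': 1, 'Male': 1}, '18-64': {'Female': 1, 'Male': 0},
#    '65 and over': {'Female': 0, 'Male': 1}}
```

Because the banding logic lives in one place, a definition change only needs to be made once, rather than in every spreadsheet formula.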
UDS Import/Export Capabilities
Check if HRSA provides any import capability for UDS data. In recent years, HRSA has experimented with allowing an XML or Excel import to pre-populate the UDS tables.
If available, this can save time and reduce transcription errors. On the export side, after submission, you can export your UDS report to PDF or Excel, and keep that file. Some centers also export data from EHB to use in their own systems.
Preliminary Reporting Environment
We mentioned it in the timeline, but as a tool, HRSA’s PRE in late October allows you to upload or enter data into a test UDS environment. Use it as a dry-run for your tools. For instance, you could take your October YTD data, estimate the full year, enter it in PRE, and see what edits fire. This can highlight needed fixes in your data extraction programs. It’s a safe space since it’s not the official submission. It also lets you practice the process.
Version Control and Change Tracking
Use technology to maintain version control on your data files. For example, if you use spreadsheets, clearly version them. This helps if something goes awry; you can backtrack. Similarly, keep an archive of how calculations were done. Some advanced users put their UDS workflows into code.
If you have an analyst who can do that, it dramatically reduces manual errors and makes rerunning with updated data easier. Even if not coding, consider using built-in audit trail features of systems: many EHRs log reports or have query libraries, so you know exactly which query was used for which data.
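One lightweight way to keep a data trail in code is to log a content hash for every extract file that feeds a UDS table. A minimal sketch, assuming a hypothetical log file name; nothing here is an official HRSA tool:

```python
import hashlib
import json
from datetime import datetime, timezone

def log_extract(path, query_name, notes=""):
    """Record a content hash and metadata for a data extract file.

    Hypothetical provenance helper: the hash lets you later prove exactly
    which version of an extract fed a given UDS table.
    """
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    entry = {
        "file": path,
        "sha256": digest,
        "query": query_name,
        "logged_at": datetime.now(timezone.utc).isoformat(),
        "notes": notes,
    }
    # Append-only log; each line is one JSON record (JSONL format).
    with open("uds_extract_log.jsonl", "a") as log:
        log.write(json.dumps(entry) + "\n")
    return entry
```

If a number is questioned months later, re-hashing the archived file and matching it against the log shows the extract was not altered after the fact.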
Security and Privacy Considerations
UDS data is aggregate and de-identified, but you should still handle the underlying data securely. If you are moving data between systems, ensure those files live on secure drives and are properly deleted afterwards if they contain any PHI. Use secure communication if sending data for external TA review.
The data integration for UDS often involves PHI at intermediate steps, so maintain HIPAA compliance in your data workflows. Also, in EHB, ensure only authorized staff have access to UDS reports, as they contain sensitive operational info.
Future-Proofing with UDS+ Technology
As HRSA moves towards UDS+ in the coming years, consider the technology needed for that. UDS+ will rely on FHIR API connections from EHRs to HRSA. While widespread adoption has been delayed, getting your IT team knowledgeable about FHIR and your vendor’s capabilities is forward-thinking. Some EHRs may offer a “UDS+ export” functionality.
Participating in pilot tests could give you a leg up. In any case, aligning your data systems now to capture all needed data at the patient level will make any future tech transition smoother. Essentially, keep an eye on HRSA’s UDS Modernization updates so your tech investments align with where reporting is going.
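To get a feel for what patient-level submission involves, it helps to see how an internal record might map to a FHIR resource. A deliberately oversimplified sketch; the field names on `record` are assumptions, and a real UDS+ submission follows HRSA's FHIR implementation guide, which specifies far more structure than shown here:

```python
import json

def to_fhir_patient(record):
    """Map a de-identified internal record to a minimal FHIR R4 Patient.

    Hypothetical mapping sketch only: real profiles constrain which
    elements are required and how identifiers must be handled.
    """
    return {
        "resourceType": "Patient",
        "id": record["pseudo_id"],          # de-identified surrogate key
        "gender": record["gender"],
        "birthDate": record["birth_year"],  # year-only to limit identifiability
    }

# Bundle a small batch of mapped patients, FHIR's container for multiple resources.
bundle = {
    "resourceType": "Bundle",
    "type": "collection",
    "entry": [{"resource": to_fhir_patient(r)} for r in [
        {"pseudo_id": "a1b2", "gender": "female", "birth_year": "1980"},
    ]],
}
print(json.dumps(bundle, indent=2))
```

Even a toy mapping like this surfaces the practical questions UDS+ raises: which internal field is the surrogate key, and where each demographic value actually lives in your EHR.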
Dedicated UDS Software/Services
There are also consultant firms and software that specialize in UDS reporting support. For example, some companies offer “UDS reporting services” with software that creates submission-ready files and checklists. They often build data pipelines from your systems and handle the aggregation for you.
If your health center has particularly complex data sources or limited internal capacity, contracting such services can reduce burden (though at a cost). Even with these, you remain responsible for final verification, but the heavy lifting of crunching numbers is assisted by technology.
Preparing for HRSA Reviews and Audits
Once the UDS report is submitted, health centers should be prepared for potential HRSA reviews or audits of their data and processes.
HRSA oversight can come in a few forms: the routine UDS review; Operational Site Visits, which include verifying data systems; or, less commonly, a specific data audit if something triggers concern. Here’s how to be ready for these reviews:
Maintain Documentation and Justifications
As emphasized earlier, keep a robust documentation file for your UDS submission.
In preparation for a review or audit, organize this information so it’s easy to present. For every table or major data point, have the supporting detail ready.
Example: If an OSV team asks, “How did you get 12,000 patients in UDS?”, you should be able to show a report or printout from your EHR listing those 12,000 patients (de-identified is fine) and describe the process of unduplicating them.
Similarly, if asked about a clinical measure, you might show an EHR query or quality report backing it up. By having these on hand, you can quickly satisfy auditors. Audit-ready means nothing is a black box – you can open the box and show what’s inside for any number you reported.
Know Your UDS Submission Inside-Out
The staff who will interface with reviewers should have a strong grasp on the content of the UDS report. This means reviewing the final submission and understanding the story it tells.
Be ready to answer questions like:
- “What factors drove the increase in patients from last year?”
- “Why did your cost per patient rise?”
- “How are you addressing your uncontrolled diabetes rate of 30%?”
These kinds of questions often come up in HRSA discussions. If you have narrative explanations already (perhaps submitted as comments or internally noted), use those to respond. It’s wise to brief your CEO/Project Director on key points so they’re not caught off guard if asked.
Demonstrate Your Data Collection System
During an OSV or other review, HRSA will likely assess compliance with the requirement that you have a system in place to collect and organize required data.
You should be prepared to demonstrate your data systems. This could include a walkthrough of your EHR’s reporting capability, showing how you track things like service utilization. Auditors might sample one element, for example, they might pick a certain UDS measure and ask to see the source.
Or they could ask, “Show us how you track enabling services.” Being able to navigate your EHR or data system in front of them and produce the info is convincing evidence.
Additionally, have your written policies or procedures available: some health centers have a Data Reporting Policy stating roles, timelines, and QC steps for UDS. This can be shown to reviewers to prove that you approach it systematically.
Be Ready to Explain Any Odd Data
HRSA reviewers will have your UDS data in hand, and they often come with specific questions on outliers or unusual trends. Prepare in advance explanations for any metrics that stand out.
- For example, if your health center has a far higher behavioral health visit count than peers, explain your integrated model.
- If your grant reports show relatively few migrant health patients, explain the seasonal fluctuations or outreach efforts.
- In essence, anticipate questions by looking at your data critically.
- Also, be prepared to discuss steps you’re taking to improve in any areas where performance was suboptimal.
Ensure Board Awareness and Involvement
HRSA likes to see that the governing board is aware of and using UDS data. They might ask board members during an OSV about the health center’s trends in patient growth or quality outcomes. It’s good practice to have presented the latest UDS results to your board and documented that in minutes.
That way, if an auditor asks a board member, “How do you use UDS data?” the board member can say, for instance, “We review it annually; we noticed an increase in hypertension control and congratulated the staff, and we saw a need to invest more in dental services because visits were down.”
This demonstrates that UDS data isn’t just being filed away, but actively discussed in governance, which is something HRSA expects under program monitoring and data reporting requirements.
Be Honest and Upfront in Reviews
If during a review or audit, an issue is identified, it’s best to be candid and solution-focused. HRSA generally works with health centers to correct issues if they’re identified.
- For example, if an OSV finds that you calculated something wrong, they may recommend a corrective action to fix the process.
- Show that you take it seriously and outline how you will address it.
- Trying to cover up or being evasive would erode trust. In many cases, if it’s a minor data error, HRSA might ask for a corrected submission or note it and move on, especially if you have a plan to avoid recurrence.
Continuous Compliance Monitoring
HRSA’s compliance manual suggests health centers should continually monitor performance and outcomes data and use it for improvement.
Demonstrating this continuous monitoring is key: beyond UDS, show that you generate reports on patient utilization, quality, and other metrics for internal use. Auditors may not ask for all of it, but it creates an environment where data-driven decisions are the norm, which they appreciate. It also ties into other requirements; UDS data is often part of those discussions, so it can satisfy multiple compliance areas.
External Audits or OIG
It’s rare but not impossible that external auditors could audit UDS data, especially if there were allegations of misreporting. The best defense is having thorough documentation and following all the rules. If you’ve double-checked and explained your data, there should be little to worry about.
It’s also prudent to keep historical records: keep copies of past UDS submissions and related correspondence for at least several years. That way, if a question arises about what was reported 3 years ago, you can refer to it.
Addressing UDS in Operational Site Visit Prep
When preparing for an OSV, pay close attention to the section on Program Monitoring and Data Reporting Systems. Ensure you can show compliance with each element:
- A system for program performance monitoring.
- The ability to produce data-based reports on utilization, patient population trends, etc., for internal use.
- That you submit reports in a timely manner.
- If there were any past UDS conditions or issues, have documentation on how they were resolved.
Some OSV teams use a sampling methodology, cross-checking a small sample of source documents against reported data to ensure consistency. Being prepared for any spot-check reinforces confidence.
Showcasing Data Use for QI
Often, reviewers will be impressed if you can show how you use UDS for Quality Improvement initiatives. For example, if you had a low colorectal cancer screening rate in UDS, did you start a new screening project? You can share that story. Or how you are using UDS patient demographics to tailor outreach.
This paints a picture of an organization that not only reports data but actively uses it to improve care, which is ultimately HRSA’s goal. It can turn a potentially dry compliance conversation into a narrative of improvement and learning.
Future of UDS Reporting
UDS reporting has been a mainstay for decades, but it is not static. HRSA is actively working on modernizing and improving UDS to reduce burden and increase the usefulness of the data. Looking ahead, here are the key developments and trends that represent the future of UDS reporting:
Transition to UDS+
Perhaps the most significant change on the horizon is the move from aggregate data to patient-level data submission, often referred to as UDS+.
HRSA’s UDS Modernization Initiative intends to have health centers submit de-identified patient-level data through standardized electronic formats. This would be a paradigm shift: instead of filling out tables, health centers would upload a dataset containing each patient’s information, and HRSA would compute the aggregate tables itself. The rationale is to improve data quality, allow more flexible analyses, and reduce manual work at the health center level in the long run.
Pilot tests of UDS+ have already occurred; in 2023, a few health centers submitted patient-level UDS data as a test. HRSA had planned a broader UDS+ implementation for the 2024 reporting year, but it was postponed. Specifically, HRSA announced that health centers would not need to submit UDS+ for 2024 data as initially scheduled, and would instead submit only the standard UDS by an extended deadline. This delay suggests HRSA is refining the process.
However, the initiative is very much alive: HRSA’s FHIR implementation guide for UDS+ is at version 2.0, and testing continues. We can anticipate that within the next couple of years, UDS+ will become a requirement. Health centers should prepare by ensuring their EHR data capture is comprehensive and by working with vendors on capabilities to export data in required formats.
Related: UDS and UDS Plus: The Ultimate Guide to Healthcare Compliance and Data Reporting
Major UDS Content Changes (Effective 2026)
Along with format changes, HRSA periodically overhauls what data is collected. A significant restructuring is slated for the 2026 UDS reporting (data for calendar year 2026, reported in 2027), described as one of the largest in decades. Anticipated changes include:
- Elimination of managed care utilization reporting on Table 4: likely removing the lines about managed care patients and visits.
- Renaming/restructuring service categories on Table 5: This could mean different groupings of staff or visits.
- Significant changes to Table 6A diagnoses and services: Possibly updating diagnosis categories to current health priorities, adding or removing certain measures.
- Full redesign of Table 8A (Costs): Shifting to a new cost reporting structure, maybe to better capture value-based care activities or separate clinical vs enabling costs differently.
- Simplified Table 9D (Revenue): Perhaps combining lines or focusing on net revenue rather than gross and adjustments. This might accompany a move to accrual basis.
- A move from cash-based to accrual-based reporting: This is a big one; UDS revenues have historically been reported on a largely cash basis. Moving to accrual would mean reporting revenues for services provided in the year, regardless of when cash is received. This aligns with standard accounting but will require procedural changes in how data is pulled.
These updates will require health centers to adapt quickly. It underscores the importance of reading each year’s Program Assistance Letter and training. The 2026 changes aim to streamline UDS and focus it on more relevant data.
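The cash-versus-accrual distinction noted above is easy to illustrate. A minimal sketch with hypothetical claims, showing how the same transactions yield different revenue figures under each basis:

```python
from datetime import date

# Hypothetical claims: service date, payment date (None if unpaid), amount.
claims = [
    {"service": date(2026, 11, 10), "paid": date(2027, 1, 20), "amount": 120.0},
    {"service": date(2026, 12, 5),  "paid": None,              "amount": 80.0},
    {"service": date(2025, 12, 28), "paid": date(2026, 2, 1),  "amount": 100.0},
]

def cash_basis_revenue(claims, year):
    """Revenue = cash received during the year, regardless of service date."""
    return sum(c["amount"] for c in claims
               if c["paid"] and c["paid"].year == year)

def accrual_basis_revenue(claims, year):
    """Revenue = value of services delivered during the year, paid or not."""
    return sum(c["amount"] for c in claims if c["service"].year == year)

print(cash_basis_revenue(claims, 2026))     # → 100.0 (only the 2025 service paid in 2026)
print(accrual_basis_revenue(claims, 2026))  # → 200.0 (both 2026 services, even unpaid)
```

The gap between the two figures is exactly what a procedural change would have to reconcile: under accrual, the reporting query must key on service dates and include receivables, not just posted payments.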
Increased Integration with Electronic Data Sources
As UDS modernizes, we expect tighter integration with EHRs and other health IT systems.
If UDS+ becomes standard, health centers might have the option to use APIs to send data directly, reducing manual entry. ONC is involved in the UDS modernization, indicating a push to align UDS with national health IT standards. In the future, a health center could potentially click a button in its EHR that generates the UDS report dataset.
This will make having a certified, interoperable EHR even more critical. It also means health centers will need IT staff or support to manage these data submissions.
Real-Time or More Frequent Reporting
Currently, UDS is annual, which means a long lag between care delivery and data availability. There have been discussions about moving to more frequent reporting once UDS+ is in place, enabling more timely monitoring of health center trends.
We’re not there yet, but future UDS could allow HRSA to see emerging issues in closer to real-time. Health centers might have to adapt to a cadence of reporting data multiple times a year, though perhaps in exchange for simpler submissions.
Evolving Clinical Measures
The clinical quality measures in UDS will continue to evolve to reflect current clinical guidelines and priorities. For instance, we might see new measures around substance use treatment, social risk screenings, HIV prevention, or others as healthcare emphasis changes.
Measures that have topped out might be retired. Also, expect alignment with other federal programs: HRSA may align UDS measures with CMS MIPS quality measures or Healthy People objectives to streamline efforts. For example, if the blood pressure control definition changes nationally, UDS will change to match. Health centers will need to continually update their EHR build to keep up.
Social Determinants of Health (SDOH) Data
There’s an increasing focus on SDOH in healthcare. While UDS collects some related info, future UDS might incorporate more data on social needs and services. HRSA could add elements like “food insecurity screenings done” or outcomes related to enabling services.
Already, UDS started collecting enabling services data. As part of the push for health equity, HRSA might refine Table 7 to look more at outcome disparities or add new stratifiers. In any case, health centers should watch for expansion in data capturing health-related social needs.
Reduced Reporting Burden (Long-Term)
One goal of modernization is to actually reduce the burden on health centers. By automating the process and perhaps eliminating some less useful tables, HRSA intends to make UDS easier. In the near term, the transition will feel like more work, but ideally, in the future, health centers won’t be hand-aggregating dozens of tables. They might maintain a continuous data feed.
Another aspect is better data feedback from HRSA. We might see improved tools where health centers can log in and see interactive dashboards of their UDS data versus peers, etc., which can be motivators for quality improvement.
UDS and Value-Based Care Transition
As healthcare moves toward value-based care models, UDS may adapt to capture that. For example, more emphasis on outcomes vs. volume, tracking things like care coordination activities, or patient satisfaction.
If health centers start participating in ACOs or other models, HRSA might align some reporting to ensure UDS remains relevant in a value-based world. We might also see merging or cross-use of data with other reporting.
State and Other Stakeholder Access
Currently, UDS results are publicly available at aggregate levels. In the future, there may be more granular data sharing. For instance, state health departments could access certain de-identified patient-level UDS data for planning.
HRSA’s modernization might allow more dynamic querying of the national dataset to inform policy. For individual health centers, this means your data might be used in broader analyses more than before, emphasizing the need for accuracy because more eyes will be on it in new ways.
CapMinds UDS Reporting Service: Audit-Ready Submissions, Cleaner Data, Stronger Funding Outcomes
UDS reporting is not a “year-end task”; it’s an operational compliance system that depends on how your EHR, billing, HR, and finance data behave all year.
CapMinds helps FQHCs and Health Center Program organizations reduce reporting burden, improve data integrity, and submit audit-ready UDS packages with defensible documentation.
Our teams align your reporting logic to HRSA definitions, reconcile cross-table variances early, and strengthen the controls that prevent late-cycle surprises before the February 15 deadline.
CapMinds services aligned to UDS reporting include:
- UDS Reporting Services (planning, table prep, validations, reviewer response support)
- FQHC Analytics & Data Warehousing (single source of truth for UDS)
- EHR Reporting & Optimization Services (UDS-ready workflows, structured capture)
- Interoperability & Integration Services (EHR–billing–GL–HR data pipelines)
- Quality Measure Reporting & Population Health Services (6B/7 performance + drilldowns)
- Compliance Documentation & Audit Readiness Services and More
If you need predictable UDS outcomes, not last-minute firefighting, CapMinds can own the delivery.



