A Report from Families USA
December 1998

The Quality of Maryland and District of Columbia Medicaid Managed Care Plans: External Reviews



Table of Contents

EXTERNAL QUALITY REVIEWS FOR MARYLAND AND DISTRICT OF COLUMBIA MEDICAID MANAGED CARE PLANS

What Are Reviewers Asked To Examine?
A. Federal Guidelines
B. Maryland External Reviews
C. District of Columbia External Reviews

DIFFERENCES BETWEEN EXTERNAL REVIEWS IN MARYLAND AND IN THE DISTRICT

A. Questions Pertaining to Performance Standards and Systems Review
B. Questions Pertaining to Actual Care

Table: Questions Pertaining to Actual Care

C. What Data Are Publicly Available?

WHAT DID EXTERNAL REVIEWS SAY ABOUT MANAGED CARE PLANS?

I. MARYLAND RESULTS

Summary
A. Results from each Maryland Medicaid Managed care plan

Chart: Maryland 1997 Percentage Scores
FreeState Health Plan (FSHP)
Optimum Choice (OCI)
Prudential (PHCP)
Total Health Care (THC)
United HealthCare/Chesapeake (UHC)


II. DISTRICT OF COLUMBIA RESULTS

Summary

Chart 2: District of Columbia HMOs Ambulatory Record Screens
Chart 3: Medical Records Content Generic Screens
Results from each District of Columbia Medicaid Managed Care Plan
Chartered Health Plan (CHP)
DC Health Cooperative (DCHC)
George Washington University Health Plan (GWUHP)
Prudential Health Care Plan (PHCP)

CONCLUSION

END NOTES



EXTERNAL QUALITY REVIEWS FOR MARYLAND AND DISTRICT OF COLUMBIA MEDICAID MANAGED CARE PLANS

For consumers, advocates, and Medicaid administrators, external quality reviews can be an important source of objective information about the strengths and weaknesses of managed care plans. Under federal law and regulations, states must contract with an entity that is separate from both the state and its contracting managed care organizations to conduct an annual review of the quality of care that managed care organizations provide to Medicaid beneficiaries.1

Federal guidelines, issued by the Department of Health and Human Services in 1993, explain that external reviews should serve two purposes: (1) they should offer an independent assessment of the quality of care provided to Medicaid beneficiaries, and (2) they should help resolve the problems in health care identified by the review and thus improve quality. Currently, what external reviewers examine and how states use the results vary considerably from state to state. States define a "scope of work"2 in contracts with external review organizations. In some states, laws or regulations governing Medicaid managed care address quality reviews and allow sanctioning of plans that perform poorly. Under the Balanced Budget Act, the federal government is supposed to establish a protocol for external reviews in 1999.

Families USA, a national consumer health advocacy organization, examined the most recent external reviews available to the public in Maryland and the District of Columbia to answer the following questions:

What are reviewers asked to examine in each state?

What data are publicly available?

What were the results for each managed care organization last year?

How do managed care organizations vary with respect to quality?

How well does each jurisdiction follow up to remedy problems?

What lessons can be learned for quality oversight? Does each jurisdiction's "scope of work" produce useful results?

Both Maryland and the District of Columbia contract with Delmarva Foundation, a peer review organization, to conduct external reviews of Medicaid managed care plans. For this study, Families USA examined the Fiscal Year 1997 reviews for Maryland Medicaid managed care plans, which covered calendar year 1996; and the Fiscal Year 1995/1996 reviews for District Medicaid managed care plans, which covered the period from September 1994 through August 1996.3

What Are Reviewers Asked To Examine?

A. Federal Guidelines

Existing federal guidelines recommend three types of external review activities. First, reviewers should undertake focused "pattern of care" studies, investigating care in specific clinical areas (e.g., pregnancy, immunizations, asthma) or health care delivery areas (e.g., access to care, continuity of care, health education). Reviewers pull records relevant to a study topic and compare the documented care to practice guidelines and quality of care indicators. Second, reviewers should study some individual cases of serious but rare clinical incidents. For instance, reviewers might examine all childhood deaths to see whether the deaths could have been prevented. Third, reviewers should explain what managed care organizations should do to improve their performance. Then, on subsequent reviews, external quality review organizations should determine whether the managed care plans implemented corrective action plans and whether care improved as a result.

States have the option of either using their own agency personnel or contracting with external review organizations to monitor other aspects of Medicaid managed care. For instance, federal regulations require state Medicaid agencies to monitor utilization, disenrollment, and managed care organizations' grievance and quality assurance procedures.4 Maryland and the District of Columbia are among the states that contract with external review organizations to carry out this monitoring.

B. Maryland External Reviews

In Maryland, Delmarva divides its review into two sections: performance standards and focused studies. The performance standards section of the review examines policies and procedures of the managed care organization for quality assurance, utilization review, enrollee rights, enrollment and disenrollment, patient satisfaction, health education, medical records, and data systems. Focused study areas for Fiscal Year 1997 were ambulatory care, continuity of care, hypertension, immunization, mental health, new enrollees, pregnancy, and re-review. In each of these areas, reviewers specifically sampled relevant records from each Medicaid HMO. For example, reviewers sampled records of patients with hypertension from each HMO to determine whether each HMO provided adequate hypertension care generally.

In both the performance standards and focused review sections, reviewers assess compliance with a number of specific elements and give HMOs a numerical score. For FY 1997, reviewers considered scores below 70 percent to be areas of concern requiring corrective action.5 When HMOs receive a score of 100 percent compliance on a performance standard area, reviewers wait two years to re-review that area and assume a 100 percent compliance score in the year between reviews.
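
To make these scoring rules concrete, here is a minimal sketch, in Python, of the FY 1997 logic described above: scores below 70 percent are flagged as areas of concern requiring corrective action, and a 100 percent score defers re-review of that area for two years. The function and variable names are ours for illustration, not Delmarva's; the sample scores are FreeState's FY 1997 performance standards scores from the Maryland chart below.

    # Illustrative sketch only; not Delmarva's actual scoring system.
    CONCERN_THRESHOLD = 70  # percent; FY 1997 corrective action threshold

    def classify_area(area, score):
        """Classify one performance standard score under the FY 1997 rules."""
        if score < CONCERN_THRESHOLD:
            return f"{area}: {score}% - area of concern; corrective action required"
        if score == 100:
            return f"{area}: 100% - full compliance; re-review deferred for two years"
        return f"{area}: {score}% - meets threshold"

    # FreeState Health Plan (FSHP), FY 1997 performance standards scores
    fshp_scores = {"Enrollee Rights": 67, "Enrollment": 100, "Patient Satisfaction": 61}
    for area, score in fshp_scores.items():
        print(classify_area(area, score))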

C. District of Columbia External Reviews

In 1996, the District contracted with Delmarva to review care provided by most of its Medicaid managed care plans in 1995 and 1996. Most of the managed care plans started operations in 1994 or 1995, and the District allowed them some start-up time before undertaking the first review. One of the managed care plans, Chartered Health Plan, had served District Medicaid beneficiaries for many years. Since Chartered had been reviewed in 1995, its 1996 external review encompassed care in 1996 alone. Delmarva's review of District HMOs is broken into three parts: a systems review, a review of utilization statistics, and an ambulatory medical records review.

The systems review, like the performance standards review performed in Maryland, assesses the managed care organization's quality assurance program, peer review (or utilization review) plan, patient/enrollee rights, patient satisfaction program, health education plan, medical records system, enrollment (policy, plan, and member services procedure manual), and disenrollment program. It uses standards outlined in a federal Health Care Financing Administration guide. Rather than giving HMOs numerical scores on items, reviewers note whether standards were "met" or "not met." External review summaries indicate whether, in a given area, "all performance standards were met," "the majority of performance standards were met," "the majority of performance standards were not met," or "none of the performance standards were met."

The review of utilization statistics examines catastrophic cases and the length of stay for each; encounter, prevalence, and incidence rates for enrollees with hypertension and enrollees with diabetes; rates of readmission within 30 days of a hospital discharge; and immunization rates for children. The District did not make available this portion of the external review report.

The ambulatory medical records review encompassed the following areas: utilization, deaths, ambulatory care, prenatal care, treatment of hypertension, treatment of diabetes, and medical record documentation. Delmarva conducted an ambulatory medical records review by sampling 150 medical records from each HMO – 75 from 1995 and 75 from 1996. (Chartered Health Plan was reviewed in 1995, so the 1996 review examines only the 75 records from 1996.) To obtain the sample from each HMO, Delmarva examined records from all patients who died during the review period and randomly drew remaining records from Primary Care Physician sites that served the largest number of Medicaid patients. Half of the patient records sampled were for children. Reviewers also attempted to sample all patients with hypertension and diabetes to determine whether care adhered to protocols for these groups. However, because HMO data systems could not readily identify all patients with a given diagnosis and because the total number of records sampled in each HMO was small, the resulting samples of diabetics and hypertensives did not permit overall conclusions about diabetes or hypertension care.

The ambulatory medical records review is divided into thirteen "generic screens." These are factors which Delmarva examined concerning the quality of ambulatory care, such as adequacy of diagnosis, necessary diagnostic procedures done appropriately, use of appropriate treatment modalities, adherence to pediatric periodicity schedules, etc. Delmarva gave each District Medicaid HMO percentage scores on each generic screen in the ambulatory medical records review. For example, if 134 medical records were sampled in a particular HMO, and 123 showed proper diagnostic procedures, the HMO scored 92 percent on the ambulatory care screen "necessary diagnostic procedures done appropriately." When plans scored below 70 percent on a particular screen, they had to submit a corrective action plan. Since not enough records were reviewed to draw conclusions about hypertension or diabetes care, any quality concerns about hypertension or diabetes cases were reflected under generic ambulatory care screens, such as "necessary diagnostic procedures done appropriately" or "use of effective treatment modalities."
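
As an arithmetic illustration only, the short sketch below, written in Python, reproduces the percentage calculation described above using the report's own example of 123 compliant records out of 134 sampled; the variable names are ours and hypothetical.

    # Generic screen percentage score, per the example above (123 of 134 records).
    records_sampled = 134
    records_passing = 123

    score = round(100 * records_passing / records_sampled)  # rounds to 92 percent
    print(f"Screen score: {score}%")

    # Plans scoring below 70 percent on a screen had to submit a corrective action plan.
    if score < 70:
        print("Corrective action plan required")
    else:
        print("No corrective action plan required")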

DIFFERENCES BETWEEN EXTERNAL REVIEWS IN MARYLAND AND IN THE DISTRICT

There are three major differences between external reviews in Maryland and in the District. First, through regulations and contracts, Maryland sets many more specific performance standards for Medicaid managed care organizations than does the District, so the external review in Maryland addresses compliance with specific standards. For example, in Maryland, Delmarva reviews whether initial health assessments were scheduled in a timely manner and whether HMOs and their providers followed up with patients who missed appointments, because Maryland's Medicaid managed care contracts and regulations specifically require plans to do these things. The District's regulations and contracts with managed care organizations during the period under review did not. Second, Delmarva sampled a much larger number of records in Maryland than it did in the District in 1996 and selected samples that allowed comparisons between HMOs on care to various subpopulations. In the District, for fiscal years 1995 and 1996, Delmarva's review of ambulatory care records included a total of 450 records, 75 records from each plan for FY 95 and 75 for FY 96. Due to this small sample size, Delmarva could not reach any conclusions about whether hospital discharges and prenatal care were appropriate. In the managed care plan with the largest number of District Medicaid enrollees, Delmarva could not determine whether well-child visits adhered to recommended schedules. In Maryland, Delmarva reviewed a total of 2500 records for FY 1997, sampling between 400 and 500 records for each HMO, depending on the number of enrollees each served. Delmarva used encounter data to identify Maryland Medicaid managed care patients with specific diagnoses and was thus able to sample enough cases from each HMO with hypertension, pregnancies, and mental health conditions to draw conclusions about care to subpopulations in its focused reviews. The Maryland Healthy Kids program reviewed the quality of care to children in each HMO, and the Healthy Kids data were also included in Delmarva's final report.

Third, Delmarva gave plans percentage scores for the systems review in Maryland and narrative scores for the systems review in the District. Families USA did not, however, find this to be an important difference because neither jurisdiction weights how heavily given performance standards factor into an overall score. For example, in the evaluation of patient rights for District HMOs, five standards relate to a member handbook and only two relate to an HMO's response to grievances. Thus, a District HMO that failed to respond to complaints in a timely manner, did not track complaints or use information from them to improve quality, and did not have internal procedures for handling appeals still received a patient rights score of "majority of standards were met."

A. Questions Pertaining to Performance Standards and Systems Review

The performance standards sections of Maryland reviews and the systems reviews for the District examine many of the same questions:

Quality assurance: Does the managed care organization have a written plan for assuring quality that includes objectives and mechanisms for monitoring quality and remedying quality problems? Has this quality assurance program been implemented, and is it having an impact on the quality of care? Does the plan systematically collect and analyze data in order to identify quality problems, resolve them, and plan future activities?

Peer review (or utilization review): Does the organization collect and evaluate data regarding the use of services, and does the governing board monitor its utilization management program?

Enrollee rights: Does the organization clearly outline the rights and responsibilities of enrollees in an enrollee rights policy? Has it disseminated the policy to enrollees and to participating providers? Have enrollees been afforded the opportunity to lodge complaints, and does the organization have appropriate procedures in order to act on complaints, analyze complaint data, provide an appeal process, and give feedback to providers?

Enrollment: Does the plan have written enrollment procedures, a member services department, a clear member handbook with information in languages of major populations served, lists of providers accepting new clients by geographic area, and providers equipped to meet needs of special populations? Does the plan provide orientations for new members?

Disenrollment: Does the HMO have a written plan for disenrollment procedures and describe procedures for voluntary disenrollment clearly in its enrollee handbook? Does it monitor voluntary and administrative disenrollments? Does the plan work to resolve any systemic problems that members identify as reasons for voluntary disenrollment?

Patient satisfaction: Does the HMO periodically survey members' satisfaction, evaluate the results, and address causes of dissatisfaction by upgrading its services as needed?

Health education: Has the HMO developed and implemented a health education program and reported to its board on the program's effectiveness?

Medical records: Does the HMO have an appropriate medical records system including procedures to safeguard confidentiality? Do medical records reflect an adequate evaluation of patients? (These questions are examined in the performance standards section of the Maryland review and in the ambulatory medical records section of the District review.)

Data system: Does the HMO have appropriate equipment, staffing, and capabilities to collect and manage data? (The District reviews do not include this question.)

B. Questions Pertaining to Actual Care

In the portions of the external reviews that examine actual care—that is, the "focused review" in Maryland and the "ambulatory medical records review" in the District—reviewers examined different questions in the two jurisdictions, as shown in the following table:

Maryland 1997 Percentage Scores

FSHP = FreeState Health Plan; OCI = Optimum Choice; PHCP = Prudential; THC = Total Health Care; UHC = United HealthCare/Chesapeake

Performance Standards Review 1997

                                   FSHP    OCI   PHCP    THC    UHC
Overall Score                        77     97     90     83     77
Quality Improvement                  81    100     88     87     60
Peer Review                          73    100    100     73     64
Enrollee Rights                      67     83     83     83     83
Enrollment                          100    100     67    100    100
Disenrollment                        67     94     71    100     88
Patient Satisfaction                 61     92    100     69     78
Health Education                     67    100    100     83     42
Medical Records                     100    100    100     67    100
Data Systems                         78    100    100     83     78

Medical Records Review 1997

                                   FSHP    OCI   PHCP    THC    UHC
Overall Raw Score Per Plan           72     69     73     70     62
Overall Weighted Score Per Plan      73     70     73     70     63
Continuity of Care                   82     74     84     76     70
Hypertension                         71     72     72     61     57
Immunization                         70     70     85     43     45
Mental Health                        74     78     65     77     61
New Enrollee                         44     51     57     56     43
Pregnancy                            75     60     81     69     69
Child Health Review                  51     85     82     88     75
Re-review                            53     33     45     56     33
Ambulatory                           74    n/a    n/a    n/a    n/a

FreeState Health Plan (FSHP)


Performance Standards Review

FSHP scored below the threshold in enrollee rights (67%, an improvement from 50% the previous year); disenrollment (67%, a decline from 75% the previous year); patient satisfaction (61%, an improvement from the previous year's score of 39%); and health education (67%, an improvement from 41% in 1996). Problems in these areas were as follows:

The enrollee handbook did not list the right to appropriate treatment of minors and did not explain how to change practitioners. FSHP's aggregate complaint data did not sufficiently describe problems or distinguish between complaints and inquiries. No procedure was in place for feedback to providers regarding complaints.

Written policies did not explain how FSHP would handle improperly completed disenrollment forms. Disenrollment reports did not aggregate data by provider and location and did not distinguish voluntary from administrative disenrollments.

FSHP did not use patient satisfaction survey results to improve quality because its Quality Improvement Committee determined that results were only "slightly problematic" and required no corrective action. Reviewers wrote that FSHP must describe the methodology for determining whether a problem was significant.

Documentation was lacking about the qualifications of educational staff, the presence of educational representatives at each site, and evaluation of health education efforts.

While subsequent corrective action plans adequately addressed enrollee rights and disenrollment concerns—for example, FSHP revised its member handbook—reviewers stated that corrective action plans did not address all concerns with patient satisfaction or with health education.

Medical Records Review

FSHP's overall raw score on medical records was 72% (weighted score 73%). FSHP scored below the 70 percent threshold in the areas of new enrollee and re-review. The 1997 review report for FSHP did not list 1996 results in these areas because the plan had not then been operating long enough to establish a reasonable baseline.

New Enrollee:

Only 35 of 80 new enrollees sampled received timely initial health assessments. FSHP scored 44% in this area in 1997.

Re-Review:

Of 30 cases that had problems identified in 1996, 14 still had not been corrected on re-review. FSHP scored 53% in this area in 1997.

Corrective action plans adequately addressed these areas. FSHP developed and implemented a process to ensure timely initial health assessments, addressed quality problems in the 14 individual cases, and developed a system to track and monitor audited cases in the future.

Healthy Kids Review

On the Healthy Kids Review, FSHP's composite score was 51 percent, well below the 70% needed to achieve a satisfactory score. On each component (health and developmental history, comprehensive physical exam, lab tests, immunizations, and health education), fewer than 60 percent of children sampled received appropriate screening—scores ranged from 38 to 57 percent.

In its corrective action plan, FSHP submitted a draft policy and procedures manual and other measures to ensure better Healthy Kids screening. Reviewers accepted this corrective action plan and will evaluate its impact in the next review.

Optimum Choice (OCI)


Performance Standards Review

In its performance standards review, OCI met program expectations, scoring above the 70 percent threshold in every area.

Medical Records Review

On the medical records review, OCI's overall raw score was 69% (weighted score 70%). Areas of concern were new enrollees, pregnancy, and re-review.

New Enrollees: Forty of eighty new enrollees sampled did not receive timely initial health assessments. OCI scored 51% on new enrollees in 1997 and 38% in 1996.

Pregnancy: Twelve of 15 pregnancy screens failed the review, and Delmarva noted that OCI had failed on 11 of these screens in both of the previous two years as well. Failed FY 97 screens included: first trimester diagnostic procedures, glucose test, hematocrit, alpha feto protein, urine, office visits, office visit evidence, standard physical findings, health education, adequate outreach, postpartum examination, and other quality concerns. OCI scored 60% on pregnancy in 1997 and 47% in 1996.

Re-review: Thirty-seven of 55 cases that were re-reviewed because quality problems were identified in the previous year failed review again. OCI scored 33% on re-review in 1997 and 26% in 1996.

OCI was not a Medicaid HMO after June 30, 1997 and therefore was not asked to submit corrective action plans regarding medical review areas. Instead, OCI planned to notify physicians of individual quality concerns so that physicians could take appropriate action.

Healthy Kids Review

On the Healthy Kids review, OCI received a composite score of 85% and exceeded minimal standards on all components.

Prudential (PHCP)


Performance Standards Review

Prudential (PHCP) scored below the threshold in meeting enrollment standards (67%, a decline from 100% in 1995 and in 1996).* Maryland requires translation of member handbooks and enrollment materials if more than 10% of a plan's population speaks a language other than English. PHCP provided materials only in English and did not have a mechanism in place to determine whether its members spoke other languages. PHCP also lacked a written enrollment plan. Materials did not inform members about how to select a provider and how to access urgent and emergency services. The enrollee handbook did not explain how members could disenroll from the HMO. PHCP responded to the review with an enrollment plan and made some changes to the enrollment handbook but did not address all of the reviewers' concerns.

Medical Records Review

Overall, PHCP scored 73 percent on medical records. It scored below the 70 percent threshold on mental health, new enrollees, and re-review.

Mental health: Individual treatment plans were often completed without patient participation, and were often not signed or updated. Prudential scored 65% on mental health in 1997 and 66% in 1996. Reviewers noted that the same areas remained problematic for PHCP both years.

New enrollees: Many new enrollees did not receive initial health appraisals within allowable timeframes. PHCP's score for new enrollees declined to 57% in 1997 from 61% in 1996. In its 1997 corrective action plan, Prudential reported that a change in its system would correct this in the future.

Re-review: When 1996 reviews indicated quality concerns with patient records, these patients' records were reviewed again in 1997. In many cases, problems had not been corrected at the time of re-review. After excluding cases in which patients were noncompliant, 26 of 47 re-reviewed cases again failed review criteria in 1997, leaving PHCP with a 45% re-review score for 1997, a decline from PHCP's 1996 re-review score of 65%.

PHCP's subsequent corrective action plans adequately addressed concerns with new enrollees and with re-review. PHCP resumed use of a protocol for initial health appraisals and addressed quality concerns with a group of physicians. PHCP's corrective action plan regarding mental health care was not adequate because it did not assure regular updating of individualized treatment plans or address documentation and tracking of referrals.

Healthy Kids

PHCP's composite score on child health was satisfactory. However, the immunization score of 68% was an area of concern.

Total Health Care (THC)

Performance Standards Review

THC received an overall score of 83%. It scored below the 70 percent threshold on patient satisfaction (69%, a decline from 81% in 1996) and medical records (67%, a decline from 100% in 1996). The problems were as follows:

The governing body did not approve the procedure for evaluating patient satisfaction. Procedures did not incorporate a methodology for reevaluating the effects of corrective actions.

The HMO did not have a procedure for reviewing medical records for legibility, organization, completeness, and conformance to standards.

Reviewers noted that THC's corrective action plans did not adequately address these concerns.

Medical Records Review

Overall, THC scored 70 percent on medical records (weighted score also 70%). Areas of concern were hypertension; immunization status at 24 months; new enrollees; pregnancy; and re-review.

Hypertension: THC failed to provide patient education, adequate outreach, referrals to appropriate agencies, or frequent and intensive services on a number of cases sampled. THC scored 61% on hypertension in 1997 and 64% in 1996.

Immunizations: Only 26 of 60 children sampled were up-to-date on immunizations at 24 months. THC scored 43% on immunization status in 1997, a decline from 60% in 1996.

New enrollees: Forty-five of the 80 new enrollees reviewed received timely initial assessments. THC scored 56% on initial health assessments in 1997 and 45% in 1996.

Pregnancy: THC failed eight pregnancy screens (glucose test, hematocrit, alpha feto protein, urine screen, office visits according to schedule, standard physical findings, substance abuse referral, and adequate outreach). Although THC's score on some of these screens had improved from previous years, none had scored above the 70% minimum threshold in the last three years. THC's overall score on pregnancy was 69% in 1997 and 49% in 1996.

Reviewers determined that THC's corrective action plan for immunization status was adequate—THC obtained new computer software to track immunizations, began bi-monthly peer reviews of immunization status, and developed protocols regarding immunizations. To address problems in other areas, THC submitted a hypertension protocol and planned monthly peer review audits of hypertension, developed outreach letters for new enrollees, centralized its obstetric services to a single provider group, and revised its medical records format. However, reviewers noted that these corrective actions did not adequately address concerns.

Healthy Kids Review

Total received a composite score of 88 percent and exceeded minimum thresholds on each component.

Total Health Care is not a Health Choice contractor in 1998 but does have a subcontract with FreeState Health Plan to serve the Medicaid population in Maryland.

United HealthCare/Chesapeake (UHC)

Performance Standards Review

In its performance standards review, UHC scored below the threshold in health education (42%, a decline from 100% in previous years); peer review (64%, also a decline from 100% in previous years); and quality improvement (60%, also a decline from 100% in previous years). In these areas, the problems noted were as follows:

The Board of Directors met only once during the year and did not oversee health education, utilization management, or quality improvement, as required.

The health education plan did not include a follow-up mechanism to determine the impact of health education on Medicaid enrollees. The health education plan did not include health education staff credentials or provide for educational representatives at each site.

There were no measurable goals of utilization management and no specific educational programs for providers or staff regarding utilization. There was no documentation of Board involvement in utilization management.

Although quality improvement studies were developed, no documentation showed that the studies were completed.

Reviewers found that UHC's subsequent corrective action plans did not adequately address concerns in any of the above areas. For example, in the area of quality improvement, UHC developed corrective action plans to improve the quality of individual providers' care but had no timeframes to monitor corrective action and no plans to follow up if the action was not effective.

Medical Records Review

UHC's overall raw score for medical records review was 62% (weighted score 63%). It scored below the 70% threshold in hypertension, immunization status at 24 months, mental health, new enrollees, pregnancy and re-review.

Hypertension: UHC failed to provide services at appropriate frequency and intensity, to provide patient education, to refer to appropriate agencies, to provide periodic physical assessment, and to provide adequate outreach. UHC scored 57% in this area in 1997 and 62% in 1996.

Immunizations: Only 27 of 60 children sampled were up-to-date on immunizations at age two. UHC scored 45% in this area in 1997, a decline from 68% in 1996.

Mental health: Problems were found in medical history, patient education, referrals, case manager tracking, after-care, and other areas. UHC scored 61% in this area in both 1997 and 1996.

New Enrollees: Fewer than half of new enrollees sampled had timely initial health assessments. UHC scored 43% in this area in 1997 and 34% in 1996.

Pregnancy: Although UHC's pregnancy score in 1997 improved over previous years, fewer than 70 percent of cases sampled met certain screens: glucose test, hematocrit and RPR, alpha feto protein, office visits according to schedule, adequate initial visit, adequate outreach, and postpartum evaluations. UHC scored 69% on pregnancy care in 1997, and 37% in 1996.

Re-Review: Of 67 cases that were reviewed again in 1997 because of quality problems reviewers identified the previous year, 47 still did not have problems corrected. UHC scored 33% on re-review in 1997, a decline from 65% in 1996.

To correct these problems, UHC planned to hire and train a unit to provide peer review of hypertension cases, immunization status, and pregnancy care and to follow up on individual quality concerns. UHC also terminated a contract with a mental health vendor. However, reviewers noted that these subsequent corrective action plans did not adequately address any of the above areas of concern.

Healthy Kids Review

On the Healthy Kids review, UHC received a satisfactory composite score but needed quality improvement in the area of laboratory tests and immunizations. Of the 60 cases reviewed, 68% were up-to-date in immunizations and 68% had appropriate laboratory tests.

II. DISTRICT OF COLUMBIA RESULTS

Summary

Families USA reviewed the Fiscal Year 1995/1996 reports for four District of Columbia Medicaid managed care plans. For all but one HMO, the 1996 review actually encompassed two fiscal years: September 1, 1994 through August 31, 1996.

Overall, the systems reviews show:

Three plans provided an "adequate" quality of care, and one, D.C. Health Cooperative, was "not adequate."

Most D.C. Medicaid managed care plans did not have regularly meeting quality assurance committees. Three of the four plans did not periodically review providers' credentials.

One plan (D.C. Health Cooperative) did not have an internal peer review system at all. Three plans did not systematically collect and review data on performance and patient results.

In their member handbooks, two plans (Chartered and D.C. Health Cooperative) did not explain patient rights; two plans (George Washington and Prudential) did not explain how to disenroll; and one plan did not explain grievance procedures. One plan (Chartered) did not properly respond to grievances.

Three plans did not adequately use patient satisfaction surveys in their quality assurance program, and two (Chartered and D.C. Health Cooperative) did not complete patient satisfaction surveys.

Three of the plans had deficiencies in their health education plans and activities – one (D.C. Health Cooperative) did not have a health education plan at all but rather left responsibility for health education to doctors. Two plans (D.C. Health Cooperative and George Washington) had no system to track missed appointments.

All plans were cited for not analyzing the reasons that members disenrolled from the plan, but this was because the District (through another contractor) processed disenrollments and did not provide information about reasons for disenrollment back to the plans.

In the ambulatory care records review, three plans scored above the 70 percent threshold in all areas. In the other plan, Prudential, only 65 percent of cases reviewed showed appropriate patient education and medical and social services; Prudential scored above the 70 percent threshold on other ambulatory care screens. Due to the very small number of records sampled in each HMO, however, Delmarva was not able to draw conclusions about the adequacy of care in important areas such as appropriate prenatal care and hospital discharge and follow-up care. In two plans, even though scores for most individual screens were well above 70 percent, many records had quality concerns. Forty-five percent of D.C. Health Cooperative records reviewed had quality concerns and forty percent of Prudential records reviewed had quality concerns.

In the medical records content portion of the review, plans scored below 70 percent on many screens. All plans showed deficiencies in documenting patient education and in maintaining a list of medications on medical records. Other problem areas are noted on the attached chart.

When the District entered into new contracts with managed care organizations, District of Columbia Medicaid administrators asked plans for documentation that they had corrected problems identified in the 1995-1996 reviews.6

District of Columbia HMOs Ambulatory Record Screens

CHP = Chartered Health Plan; DCHC = DC Health Cooperative; GWUHP = George Washington University Health Plan; PHCP = Prudential Health Care Plan

                                                            CHP   DCHC  GWUHP   PHCP  Average*
Adequacy of diagnosis                                        94     96    100     98     97
Necessary diagnostic procedures done appropriately           80     85     95     92     88
Timely completion of necessary diagnostic procedures         98     92     97     94     95
Use of effective, appropriate treatment modalities           94     98     98     97     97
Access to services, including necessary off-site referral   97    n/a     97     86     93
Outreach and follow-up for enrollees failing to present      94     97     97     95     96
Appropriate patient education and medical social services
  where required                                             80     80     90     65     79
Appropriate hospital discharge                              n/a    n/a    n/a    n/a    n/a
Appropriate follow-up after hospital discharge              n/a    n/a    n/a    n/a    n/a
Appropriate prenatal care                                   n/a    n/a    n/a    n/a    n/a
Compliance with requirement for completion of initial
  health appointment                                         94     85     99     79     89
Adherence to pediatric periodicity schedules where
  appropriate                                               n/a     76     84     82     81
Other quality concerns                                       92     96     95     94     94
Plan Average*                                                91     89     95     88
Number of records with quality concerns                      20     68     31     60
Number of records reviewed                                   75    150    149    150
Percentage of records reviewed that had quality concerns*   27%    45%    21%    40%

*Computed by Families USA

Medical Records Content Generic Screens

CHP = Chartered Health Plan; DCHC = DC Health Cooperative; GWUHP = George Washington University Health Plan; PHCP = Prudential Health Care Plan

                                                            CHP   DCHC  GWUHP   PHCP  Average
Patient name, date of birth, sex, address, phone number      81     98     97    100     94
Next of kin, sponsor, or responsible party                   68     91     69     86     79
Practitioner's name and profession                          100     91    100     96     97
Legible entries                                              92     90     97     84     91
Diagnostic summary lists                                     75     61     94     79     70
Diagnostic summary list used appropriately                   73     58     41     72     61
Allergies recorded on summary list or consistently at
  another location                                           60     73     51     84     67
Medication lists maintained well                             59     45     31     68     51
Medical history                                              60     81     79     66     72
Physical exams if enrolled for at least a year               60     74     74     73     70
Health/patient education documented                          62     56     65     56     60
Referrals/reasons for and results of referrals               74    n/a     66     80     73
Documentation of emergency encounters and follow-up         n/a    n/a    n/a     93     93
Hospitalization orders and discharge summaries; HMO
  encounters during hospitalization                         n/a    n/a    n/a     89     89
Medical social service provided                             n/a    n/a    n/a     98     98
Documentation of broken appointments and recall efforts      66    n/a     50     89     68
Date                                                         98     97    100     99     99
Chief complaint or purpose of visit                          98     98    100    100     99
Objective findings                                           98     96     99     99     98
Diagnostic or medical impression                             94     94     99     99     97
Studies ordered (lab, radiology, EKG); indication of
  review results                                             81     76     95     91     86
Therapies administered and/or treatments given              100     93     97     98     97
Disposition, recommendations, and instructions to patients   81     68     86     82     79
Signature or initials of provider (if more than one
  person writes in the record)                               96     92     99     97     96
Overall organization                                         85     91     98     95     92
Plan Average*                                                 79     81     79     87
# Scores Below 70*                                             7      5      8      3

*Computed by Families USA

RESULTS FROM EACH DISTRICT OF COLUMBIA MEDICAID MANAGED CARE PLAN

Chartered Health Plan (CHP)


Problems noted in the 1996 review were as follows:

Systems review

The Quality Assurance Committee did not adhere to its meeting schedule, and governing body minutes did not reflect a review of the quality assurance plan.

The member handbook did not explain all enrollee rights or how to access all services.

Chartered did not have a system in place to respond to complaints in a timely manner, procedures for handling urgent grievances, appeal procedures, or procedures for tracking complaints that had been sent to other providers or Departments for review and resolution. A random review showed that grievances and urgent grievances were not always handled in a timely manner.

Chartered did not adequately track reasons for disenrollment and did not have a method to notify members of the status of disenrollment requests.

Chartered did not conduct a patient satisfaction survey for 18 months and did not survey enrollees who had no phones. When Chartered did survey patients, it reported results in ways that could be misleading. (For example, Chartered showed that the percentage of members that found it easy to get an appointment improved slightly between the third and fourth quarter. Actually, the same number of respondents had trouble scheduling appointments both quarters but two additional members were surveyed in the fourth quarter who did not have problems scheduling appointments.)

Chartered submitted a corrective action plan that partially addressed the areas of concern. For example, Chartered revised its member handbook and developed appropriate complaint procedures but, as of the date of the final external review report, had not explained how it would follow up on concerns patients reported in the patient satisfaction survey or how it would revise survey questions.

Medical Records Review

Quality Concerns: Seventy-five medical records were reviewed in 1996, including 22 records regarding deaths. No screens scored below the 70 percent threshold, but 20 cases had quality concerns. Chartered had to submit corrective action plans for 13 of these cases, and the corrective action plans adequately addressed concerns. One of two physicians with identified quality concerns was terminated from the network. No quality concerns were identified regarding the deaths.

Medical Records Content: In medical records content, Chartered scored below 70 percent on the following seven areas: next of kin, sponsor, or responsible party; allergies recorded; medication list maintained well; medical history; physical exam if enrolled for at least one year; health/patient education documented; and documentation of broken appointments and recall efforts. To correct these problems, Chartered planned to review medical records for all primary care physicians and obstetrician/gynecologists annually and to use the results as part of its recredentialing process.

DC Health Cooperative (DCHC)

Problems noted in the 1996 review included the following:

Systems Review

The Quality Improvement Committee did not meet consistently. No notes demonstrated reporting of quality issues or grievances. No evidence showed that major health problems had been identified for either the enrolled population or special population groups, or that DCHC had developed programs and services based on identified health problems. Minutes of the Governing Body did not reflect discussion of quality problems or a quality plan and no documentation showed that quality of care issues and resolutions were communicated to providers.

DCHC did not adhere to its utilization review meeting schedule, did not aggregate and analyze utilization data, had not documented what professionals were responsible for utilization review, did not monitor quality through its utilization review department (DCHC disagreed with this finding), and had developed only one treatment protocol which was for early and periodic screening, diagnosis and treatment of children (EPSDT).

The member handbook did not explain member rights and responsibilities.

DCHC did not analyze patterns in reasons for disenrollment. However, it planned to do so once the District provided this information.

DCHC did not survey patient satisfaction. After the on-site visit by external reviewers, DCHC did conduct a patient satisfaction survey in November 1996 and made plans to use a standard industry patient satisfaction survey the following year. Reviewers commented that DCHC still needed to develop a policy outlining the purpose and methodology of surveys to ensure that they address enrollees' needs.

DCHC had not developed a health education plan. Instead, it had delegated responsibility for health education to primary care physicians and did not monitor health education.

DCHC did not have a system in place for tracking missed appointments or providing outreach (except to persons with HIV/AIDS) to ensure compliance with medical therapy.

DCHC's corrective action plans addressed some, but not all, of the above areas. For example, DCHC developed and distributed information about enrollee rights and responsibilities, resolving concerns in the patient rights section of the review. DCHC also formed a combined Quality Improvement and Utilization Management Committee, but corrective action plans in these areas still did not address all of the reviewers' concerns.

Medical Records Review

Quality Concerns: In the ambulatory review, Delmarva reviewed 150 records from DCHC. Although DCHC did not score below 70 percent on any screen, 68 cases had identified quality concerns. Although DCHC was required to submit individual corrective action plans for 55 of these cases, it did not do so. Instead, DCHC submitted a global corrective action plan which did not adequately address concerns. DCHC's corrective action plan for two identified physicians did not adequately address concerns either. No quality of care concerns were identified regarding the two deaths during the review period.

Medical Records Content: In the review of medical record content, DCHC scored below 70 percent on the following screens: diagnostic summary list; diagnostic summary list used appropriately; medication list maintained well; health/patient education documented; and disposition, recommendations, and instructions to patients. DCHC revised its medical records protocol to include more information in response to the review findings.

George Washington University Health Plan (GWUHP)

The 1996 review identified problems in the following areas:

Systems Review

The quality assurance plan did not have clearly documented goals and measurable objectives. Governing body minutes did not reflect discussion of problems identified through quality assurance efforts.

Although utilization of services was evaluated according to treatment protocols, over- and under-utilization of services was not assessed by peer reviewers. Utilization data were not trended, and there was no documentation that utilization review results were reported to the governing body.

The member handbook did not explain disenrollment procedures. The Member Grievance and Claims Appeal Process explained that the Member Services Department would handle complaints, but did not explain how members could complain or how the Department would handle their complaints.

Disenrollment surveys did not separate Medicaid members from non-Medicaid members and response rates were poor. GWUHP did not determine reasons for disenrollment by the Medicaid population.

Similarly, patient satisfaction surveys did not separately examine the satisfaction of DC Medicaid enrollees. Overall survey results showed dissatisfaction with phone access, appointment access, and waiting times at the downtown site, and GWUHP recommended that a quality improvement group be convened to address problems. However, no documentation showed that a group was developed for this purpose. In a corrective action plan, GWUHP worked to improve telephone access by decentralizing provider phone numbers and committed to breaking out Medicaid beneficiaries in future surveys. Timeframes for re-surveying were not delineated.

There were no attendance rosters or evaluations of health education sessions.

GWUHP had not developed a system to track missed appointments. Although GWUHP contracted with case managers to provide outreach, there was no system in place for providers to identify and refer enrollees needing outreach services to the case managers.

Medical Records Review

Quality Concerns: A total of 149 medical records were reviewed for 1995 and 1996. GWUHP received no scores below the 70 percent threshold for ambulatory care. However, 31 cases had quality of care concerns, and GWUHP had to submit corrective action plans for nine of these cases. The majority of the nine corrective action plans did not adequately address concerns. One physician was counseled in response to quality concerns and was scheduled to be subject to a compliance audit. One death was identified during the review period, and it did not present quality concerns.

Medical Records Content: In medical records content, GWUHP scored below 70 percent on: next of kin, sponsor or responsible party; diagnostic summary list; diagnostic summary list used appropriately; allergies recorded; medication list maintained well; health/patient education documented; referrals, reasons for, and results of referrals; and documentation of broken appointments and recall efforts. GWUHP did not submit a corrective action plan addressing the concerns identified for seven physicians.

Prudential Health Care Plan (PHCP)

The 1996 District review of Prudential showed problems in the following areas:

Systems review

For seven months, no meetings of a quality improvement committee were documented. Minutes did not reflect a review of grievances.

PHCP did not separately evaluate utilization by Maryland and by District residents.

Although PHCP conducted health education, reviewers urged PHCP to collect attendance rosters and evaluations of sessions and to ensure that educational efforts were targeted to identified community health problems.

PHCP submitted a corrective action plan which partially addressed the above concerns.

Medical Records Review

Patient Education: Overall, PHCP scored less than 70 percent on the ambulatory record screen "appropriate patient education and medical social services."

Quality Concerns: Sixty cases presented quality concerns at four physician sites. Three of the physicians involved submitted corrective action plans. PHCP had to submit corrective action plans for 26 of the cases, but the corrective action plans did not adequately address concerns.

Medical Record Content: PHCP also scored below the 70 percent threshold on three screens regarding medical record content: medication list maintained well, medical history, and health/patient education documented. Prudential responded with a corrective action plan to improve documentation.

Deaths: During the review period, 16 Prudential patients died, and in those patient records one quality of care concern was identified, but information about this concern is not publicly available. (The quality concern may not have had any relationship to the death.)

CONCLUSION

Even though Maryland and the District of Columbia used the same peer review organization to externally review the quality of Medicaid managed care plans, the two jurisdictions obtained very different products. Maryland's managed care plan contracts were more specific than the District's regarding care standards. In Maryland, reviewers sampled enough records of patients with specific clinical diagnoses in order to draw conclusions about these subpopulations' care. Both of these factors made the Maryland reviews more useful than the District reviews in identifying problems with the provision of care.

In both jurisdictions, the external quality reviews address administrative systems or "performance standards" as well as actual care. Maryland reviews contained numerical scores on performance standards while District of Columbia reviews contained a narrative description as to whether the majority of standards in a given area were met. However, since neither jurisdiction weighted the importance of various administrative systems, we did not find summary data from the performance standards review to yield a very meaningful comparison of plan quality.

Reviews did show a number of problems with the quality of care in both District of Columbia and Maryland Medicaid managed care organizations. In the District of Columbia, many plans lacked internal quality assurance mechanisms, such as committees that met regularly to review care. Plan member handbooks did not adequately explain patient rights. Medical records were deficient in most plans. Due to small sample sizes, reviewers were not able to draw many conclusions about direct patient care. In two health plans, reviewers identified quality concerns with at least 40 percent of the cases they reviewed. Descriptions of these quality concerns are not available to the public, so we could not determine whether concerns reflected problems with actual care or problems with documentation of care.

Maryland reviews showed that all plans were deficient in scheduling initial health assessments for new enrollees. Three plans had significant quality problems in pregnancy care, two in immunizations for children, and two in hypertension care.

In both jurisdictions, Medicaid managed care organizations often failed to correct identified quality problems. Reviewers found managed care organizations' corrective action plans to be deficient in many cases. In Maryland, reviews in successive years often showed declining scores in performance standards, and care in review areas such as pregnancies remained substandard in some HMOs for several years. When the external review organization re-examined patient records to see if quality concerns had been remedied, very often Maryland managed care organizations had failed to correct identified problems.

Managed care plans have not been sanctioned for failing to meet standards in either the District of Columbia or Maryland. In order to strengthen Medicaid managed care oversight, both jurisdictions need to establish stronger procedures to ensure that once reviews identify problems, managed care organizations act to correct them and prevent their recurrence.

END NOTES

1 Social Security Act (42 USC 1396(a)(30)).

2 That is, the description of tasks to be performed by the external review organization as specified in the contract.

3 These were the most recent reviews available to the public when our study began. Fiscal Year 1998 reviews for Maryland were recently released.

4 42 C.F.R. §434.53 and 42 C.F.R. §434.63.

5 Maryland adopted new Medicaid managed care regulations effective November 1996, which require managed care organizations to meet specific quality standards. During the first year that they serve Maryland Medicaid beneficiaries, managed care organizations must receive scores of at least 70 percent in each clinical area reviewed. In each subsequent year, they must improve at least five percent until they reach a 90 percent score in each clinical area. Minimum compliance scores for the performance review areas vary between 70 percent and 100 percent on various standards. Under the November 1996 regulations, which will pertain to the FY 1998 reviews, if managed care organizations do not receive the required scores for either systems performance or clinical areas, they must submit a corrective action plan and can be sanctioned. (Title 10 Department of Health and Mental Hygiene, Subtitle 9, Chapters 65.03 and 73.01.)

6 Information from Jane Thompson, Chief, Office of Managed Care, District of Columbia Department of Health, December 1998.
