DOCUMENT RESUME

ED 363 662                                                  TM 020 784

AUTHOR       Bushery, John M.; And Others
TITLE        The Schools and Staffing Survey: How Reinterview Measures
             Data Quality.
PUB DATE     [93]
NOTE         7p.; Author is affiliated with the National Center for
             Education Statistics (NCES).
PUB TYPE     Reports - Descriptive (141) -- Reports -
             Evaluative/Feasibility (142)
EDRS PRICE   MF01/PC01 Plus Postage.
DESCRIPTORS  Administrator Characteristics; Comparative Testing;
             Elementary Secondary Education; *Followup Studies;
             *Interviews; *Mail Surveys; *National Surveys;
             Questionnaires; Reliability; *Research Methodology;
             Responses; School Personnel; *School Surveys; Secondary
             School Teachers; Teacher Characteristics; Teacher Supply
             and Demand; Telephone Surveys
IDENTIFIERS  Data Quality; *Schools and Staffing Survey (NCES)

ABSTRACT
The first mail reinterview ever conducted by the Census Bureau was conducted as part of the Schools and Staffing Survey (SASS) of 1991. Because there were some questions about the reliability of items from the 1988 SASS reinterview, the 1991 study was examined by a mail reinterview. The methodology and responses are discussed, and response variance rates for questions reinterviewed in the 1988 and 1991 SASS cycles were compared. Mail reinterview respondents displayed lower simple response variance than under the telephone-telephone procedure. Findings suggest that mail reinterview respondents are cooperative and careful, although they may leave more difficult or uncertain questions blank. Implications of these findings are discussed. Mail surveys may provide data that are as good or better than some surveys now conducted by telephone or in person. Future survey plans are reviewed. Six tables present study findings. (Contains 7 references.)

THE SCHOOLS AND STAFFING SURVEY:
HOW REINTERVIEW MEASURES DATA QUALITY

John M. Bushery and Daniel Royce, Bureau of the Census,
and Daniel Kasprzyk, National Center for Education Statistics

KEY WORDS: Data quality, reinterview

1. INTRODUCTION

The Schools and Staffing Survey (SASS) is a good example of how a reinterview program can contribute to improved data quality by identifying questions which need improvement. We believe we have improved one aspect of SASS data quality, simple response variance, in part because the SASS reinterview program identified questions needing improvement. The 1991 SASS reinterview results also suggest that mail respondents provide more reliable data than those interviewed in a telephone follow-up operation.

1.1 The SASS Surveys

The National Center for Education Statistics (NCES) sponsors, and the U.S. Bureau of the Census conducts, the Schools and Staffing Survey (SASS) to provide data on teachers, school administrators, schools, and local education agencies. The SASS runs on a three-year cycle, the first in 1987-88 and the second in 1990-91. The Census Bureau conducts the SASS by mail, with telephone follow-up of cases not responding by mail. Mail response rates range from 49 percent (for private schools) to 80 percent (for public school administrators), with final response rates between 83 percent (private school teachers) and 97 percent (public school administrators again). We completed one-sixth to one-third of the cases using telephone follow-up.

1.2 The SASS Reinterview Program

Two major purposes of reinterview programs are quality assurance and estimating response error [1]. The SASS reinterviews estimate simple response variance, a measure of the inconsistency between responses over repeated applications of a question. Our main goal is to identify questions needing improvement for the next cycle of SASS. We identify problematic questions in the reinterview and follow up with cognitive research and other questionnaire design techniques to make the improvements.

To estimate response variance accurately, the survey error model assumptions require the reinterview to be an independent replication of the original interview. Independence is difficult to achieve because the respondent might remember his or her answer to the original interview question. To the extent a reinterview lacks independence, response variance may be underestimated. Operational constraints often make it difficult or impossible to conduct the reinterview as an exact replication of the original interview. When a reinterview does not replicate the original interview perfectly, the differences in methodology may overstate the response variance.

The SASS reinterviews fail to replicate the original interview in two respects:
- All SASS reinterviews contained fewer questions than their original counterparts.
- The original SASS surveys used self-administered mail-return questionnaires (with telephone follow-up of non-respondents). Except for the 1991 SASS School Survey, all the reinterviews were conducted by telephone.

We conducted the Census Bureau's first-ever mail reinterview in the 1991 SASS School Survey. Some of the 1988 SASS reinterview findings suggested that for some questions, the reinterview model assumptions were not adequately met [2]. Section 2.3 discusses this topic in more detail. These results prompted us to evaluate the 1991 SASS School questionnaire through a mail reinterview.

1.3 Response Variance Measures

Response error consists of response variance and bias. The Census Bureau estimates two main metrics (from unweighted data) to quantify response variance, the gross difference rate and the index of inconsistency. In a categorical variable, one-half the gross difference rate equals the simple response variance. The gross difference rate also represents the proportion of respondents who change their answers from one interview to the next. In a question with a gross difference rate of 20 percent, one fifth of the respondents changed their answers.

The index of inconsistency is a relative measure of response variance. A simplified definition of the

index is the ratio of the simple response variance to the total variance of a characteristic. The L-fold index of inconsistency is a weighted average of the indices over all categories in a multi-category question. An index of 50 means that half the total variance of a characteristic can be attributed to response variance. Experience provides a rough rule of thumb for interpreting the index of inconsistency. If the index is:
- less than 20, response variance is low.
- between 20 and 50, response variance is moderate.
- greater than 50, response variance is high.

High response variance means the question itself causes at least as much of the variability in the data as the variability among respondents in the population. Two reasons for high response variance are:
- The question is poorly worded and confuses the respondent.
- The information requested is too difficult for the respondent to provide.

Because the index of inconsistency estimates the ratio of two variances, the index itself has high variability. If the data don't provide enough cases in each original-by-reinterview outcome cell, a reliable estimate of the index cannot be computed.

2. REINTERVIEW RESULTS

This paper compares response variance results for questions reinterviewed in both the 1988 and 1991 cycles of SASS. Table 1 shows reinterview sample sizes and completion rates for 1988 and 1991. We used unweighted data and tested all comparisons at the α = 0.10 level. Tables 3 through 6 display 90 percent confidence intervals in parentheses.

Table 1. SASS Reinterview Sample Sizes

                                        1988    1991
Administrator Survey
  Eligible for Reinterview              1309    1123
  Response Rate                          87%     87%
Teacher Survey
  Eligible for Reinterview              1126    1101
  Response Rate                          75%     74%
School Survey
  Eligible for Reinterview              1309    1123
  Overall Response Rate                  87%     84%
  Attempted Mail Reinterview                     50%
  Percent Completed by Mail                      43%
  Attempted Telephone Reinterview *              57%
  Percent Completed by Telephone                 41%

* Includes 80 reinterviews not returned by mail and 85 original mail interviews returned too late for mail reinterview.

The Administrator and Teacher surveys ask both attitudinal and factual questions. In 1988 the attitudinal questions we reinterviewed showed high levels of inconsistency [2]. Inconsistency in attitudinal questions may result from simple response variance or from actual changes in attitudes between the original interview and reinterview. In 1991, we decided to concentrate the reinterview on factual questions, with the aim of improving future cycles of the SASS.

In the 1988 SASS, we could estimate the index of inconsistency reliably for 35 of the 45 factual questions we reinterviewed. We estimated the index reliably for 109 of the 126 factual questions reinterviewed in 1991 [3]. Table 2 summarizes the results of both SASS reinterviews.

Keep in mind that the distributions in Table 2 are not strictly comparable. We purposively selected different sets of questions for the two reinterviews. We evaluated 15 factual questions common to both cycles of SASS. Eleven of these questions received significant revisions in 1991. Four of the revised questions displayed reduced response variance. Our question improvement efforts have paid off, at least partially.

Table 2. Summary of SASS Reinterview Results *

Response Variance             1988        1991
All Three Components
  Low                       4 (11%)    43 (39%)
  Moderate                 14 (40%)    38 (35%)
  High                     17 (49%)    28 (26%)
Administrator and Teacher Surveys
  Low                       4 (19%)    26 (36%)
  Moderate                  8 (38%)    26 (36%)
  High                      9 (43%)    21 (29%)
School Survey
  Low                       0 ( 0%)    17 (47%)
  Moderate                  6 (43%)    12 (33%)
  High                      8 (57%)     7 (19%)

* Questions for which the index could be reliably estimated.

2.1 Administrator and Teacher Survey Results

The two Administrator questions reinterviewed in both SASS cycles ask whether the respondent earned a bachelor's degree and a master's degree. These "degree earned" questions are virtually the same as the corresponding Teacher survey questions. The results for Administrators were nearly identical to the

Teacher results shown in Table 3. The 1988 question provided a list of possible degrees and asked the respondent to "mark all that apply." The 1991 question asked, "Do you have a bachelor's degree?" If "Yes," the next question asked, "Do you have a master's degree?" The remaining degrees (associate, doctor's, etc.) used a "mark all that apply" approach. Table 3 suggests the direct question format produces more reliable data for degree earned.

Table 3. Teacher Survey Reinterview Results -- Degrees Earned --

                                 1988                  1991
Bachelor's Degree
  Percent Yes '                  97.6                  98.1
  GDR *                7.5 ( 6.0 - 9.2)      0.6 ( 0.3 - 1.3)
  Index               79.5 (64.2 - 98.5)     Too few cases
Master's Degree
  Percent Yes '                  41.5
  GDR *                4.3 ( 3.2 - 5.7)
  Index                8.9 ( 6.7 - 11.8)
Professional Diploma / Ed. Specialist
  Percent Yes '                   4.4                   4.7
  GDR                  7.0 ( 5.6 - 8.7)      5.2 ( 4.1 - 6.8)
  Index               69.8 (56.0 - 87.1)    62.7 (48.2 - 81.6)
Associate Degree
  Percent Yes '
  GDR
  Index

' Responded "Yes" in original interview.
* Statistically significant difference between 1988 and 1991.
[Several entries in this table, including the 1991 master's degree figures and the associate degree rows, are illegible in the source scan.]

In the Teacher survey in both SASS cycles we also reinterviewed questions on teaching assignment, years in teaching, and plans to remain in teaching (an attitude-type question). None of these questions exhibited significantly improved response variance.

The teaching assignment questions reinterviewed in 1988 and 1991 were similar but not strictly comparable. In 1991 we reinterviewed a screener question used to identify teachers, which asked about full- and part-time status and included categories for itinerant teachers, long-term substitutes, other professional staff, and administrators (the last two are out of scope for the Teacher survey). The 1988 question simply asked about full-time and four levels of part-time teaching. The 1988 question includes all full-time teachers; the 1991 figure includes only regular full-time teachers. These design differences make it difficult to compare the two questions, but response variance on the number of full-time teachers showed no significant change between 1988 and 1991. The new categories seem to cause respondents some uncertainty: about six percent (s.e. 0.8) of the respondents described their assignment as itinerant teacher, long-term substitute, other professional staff, or administrator in the original interview. Only three percent (s.e. 0.6) selected one of these answers in the reinterview. The data suggest the "itinerant teacher" category is the main source of this inconsistency. It may help to define "itinerant" more clearly on future questionnaires.

The 1988 "years teaching" questions asked, ". . . how many years have you worked as a full-time teacher in public and/or private schools . . ." (repeated for part-time) and provided a cross-tabulation for the respondent to complete:

            Years full-time    Years part-time
  Public
  Private

In 1991 we changed the format to ask four separate questions:
". . . how many years have you worked as a full-time teacher in private . . . ,"
". . . part-time in private . . . ,"
". . . full-time in public . . . ," and
". . . part-time in public . . . ."

Table 4. Teacher Survey Reinterview Results -- Years Teaching --

                                 1988                  1991
Full-time, Public
  GDR                  7.6 ( 6.1 - 9.5)      7.0 ( 5.5 - 8.9)
  L-fold Index        10.8 ( 8.7 - 13.4)     9.8 ( 7.7 - 12.4)
Part-time, Public
  GDR                  9.0 ( 6.7 - 12.0)     6.6 ( 5.0 - 8.6)
  L-fold Index        44.4 (33.2 - 59.3)    42.5 (32.5 - 55.7)
Full-time, Private
  GDR                  5.? ( 3.6 - 7.4)      5.3 ( 3.3 - 8.7)
  L-fold Index        12.4 ( 8.7 - 17.7)     8.8 ( 5.4 - 14.4)
Part-time, Private
  GDR *                3.4 ( 2.1 - 5.8)      7.5 ( 4.8 - 11.6)
  Index               38.5 (23.0 - 64.4)    37.8 (24.4 - 58.4)

* Statistically significant difference between 1988 and 1991.

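The gross difference rates and indices reported throughout Tables 3 through 6 follow the definitions in section 1.3. As a concrete illustration, here is a minimal sketch of those estimators for a yes/no item. The data are invented, and the index formula used, g / (p1*q2 + p2*q1), is one common simplified form rather than code from the SASS program itself.

```python
# Sketch of the section 1.3 response variance measures for a yes/no item.
# Toy data and estimator form are illustrative, not taken from the SASS.

def gross_difference_rate(original, reinterview):
    """Percent of respondents whose answer differs between interviews."""
    changed = sum(1 for a, b in zip(original, reinterview) if a != b)
    return 100.0 * changed / len(original)

def index_of_inconsistency(original, reinterview):
    """Simple response variance relative to total variance, on a 0-100 scale."""
    n = len(original)
    g = gross_difference_rate(original, reinterview) / 100.0
    p1 = sum(original) / n      # proportion "yes" in the original interview
    p2 = sum(reinterview) / n   # proportion "yes" in the reinterview
    return 100.0 * g / (p1 * (1.0 - p2) + p2 * (1.0 - p1))

def interpret(index):
    """Rule of thumb from section 1.3."""
    return "low" if index < 20 else "moderate" if index <= 50 else "high"

# Toy data: 1 = "yes", 0 = "no"; 2 of 10 respondents change their answer.
original    = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
reinterview = [1, 1, 1, 0, 1, 0, 0, 0, 0, 0]

gdr = gross_difference_rate(original, reinterview)   # 20.0 percent
idx = index_of_inconsistency(original, reinterview)  # about 41.7
```

With these toy responses one fifth of the respondents changed answers (a gross difference rate of 20 percent, hence a simple response variance of 10 percent), and the index of about 42 falls in the "moderate" band of the rule of thumb.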
We grouped the responses into the four categories of interest to the NCES:
- less than three years,
- three to nine years,
- 10 to 20 years,
- more than 20 years.

Unfortunately, no improvement resulted. The full-time estimates enjoyed low response variance in both years, and the part-time estimates exhibited moderate response variance in both cycles of SASS (Table 4).

The final Teacher question reinterviewed in both SASS cycles was, "How long do you plan to remain in teaching?" The consistency of this attitude-type question decreased between 1988 and 1991. The gross difference rate increased from 39.5 percent (36.8% - 42.6%) to 46.8 percent (44.0% - 49.9%), and the L-fold index increased from 55.4 (51.6 - 59.6) to 66.6 (62.6 - 71.1). Since we did not change this question, we speculate that teachers' attitudes in 1991 were less stable than in 1988. Increased response variance among public school teachers drove the overall decrease in consistency; private school teachers showed no significant change in response variance between 1988 and 1991.

2.2 School Survey Reinterview Results

In the School survey, we reinterviewed four questions in both 1988 and 1991. Although these questions were virtually unchanged between the two cycles, they showed a small but statistically significant decrease in response variance. We think a better replication of the original interview by the reinterview in 1991 caused some of this decrease. Table 5 shows the reinterview results for these questions.

The question, "Which best describes the community in which this school is located?" contained ten categories in 1988 and 1991:
1 rural or farming community
2 small city or town, not a suburb of a large city
3 medium-sized city
4 suburb of medium city
5 large city
6 suburb of large city
7 very large city
8 suburb of very large city
9 military base or station
10 Indian reservation

The index of inconsistency for these categories ranged from 21.1 to 68.8 in 1988 and from 22.2 to 62.1 in 1991. The overall response variance (L-fold index) for this question improved slightly, but remains in the moderate range. "Community" is an important variable in the NCES' analyses. Fortunately, the NCES is now able to obtain this information from geographic data files [6], instead of asking the schools. The result will be more accurate data, with reduced respondent burden.

We reinterviewed three questions about programs offered by the school: "Which of the following programs and services are available to students in this school, either during or outside of regular school hours, and regardless of funding source:
- bilingual education
- English as a second language
- extended day or before-or-after-school day care."

Table 5. School Survey Reinterview Results

                                      1988                  1991
Which best describes the community in which this school is located?
  GDR *                  34.7 (32.3 - 37.1)    30.4 (27.9 - 32.9)
  L-fold Index *         42.4 (39.6 - 45.4)    37.6 (34.7 - 40.9)
Bilingual education
  Percent Yes '                      15.3                  14.2
  GDR *                  16.2 (14.5 - 18.2)    12.1 (10.5 - 14.1)
  Index                  53.5 (47.7 - 60.0)    45.1 (39.0 - 52.3)
English as a second language
  Percent Yes '                      31.6                  28.3
  GDR *                  16.1 (14.4 - 18.1)    13.7 (12.0 - 15.8)
  Index *                37.1 (33.1 - 41.7)    30.1 (26.3 - 34.6)
Extended day or before-or-after-school day care
  Percent Yes '                      16.3                  23.0
  GDR                     9.3 ( 7.9 - 11.0)     8.8 ( 7.4 - 10.6)
  Index                  31.7 (26.8 - 37.4)    24.7 (20.5 - 29.7)

' Responded "Yes" in original interview.
* Statistically significant difference between 1988 and 1991.

2.3 Mail versus Telephone Results (1991)

In 1991 we revised the School survey reinterview procedures:
- We used a mail reinterview for mail respondents and a telephone reinterview for telephone follow-up cases.
- We requested the same respondent complete the

reinterview questions as answered the original School survey.

Both procedural changes helped the reinterview replicate the original survey better. We decided to specify the original School respondent as the reinterview respondent because, in the 1988 reinterview, we inadvertently changed the reinterview's respondent selection rules by combining the Administrator and School reinterview questionnaires. We suspect many administrators had an assistant or secretary complete the original School survey. Changing respondents between the original interview and reinterview tends to overstate response variance in the 1988 School survey.

We did not conduct a controlled experiment, but we reinterviewed by mail whenever possible and by telephone when necessary, obtaining about 465 mail-mail cases and 270 telephone-telephone cases. This analysis covers the same four School survey questions discussed in section 2.2. Under the mail-mail procedure almost all the School questions reinterviewed in 1991, including the four in Table 6, displayed lower simple response variance than under the telephone-telephone procedure.

Table 6. Mail Original/Reinterview versus Telephone Original/Reinterview

                              Mail-Mail         Telephone-Telephone
Community School Located
  GDR *              19.0 (16.3 - 22.2)    39.9 (35.5 - 45.2)
  L-fold Index *     24.0 (20.6 - 28.2)    48.6 (43.2 - 55.1)
Bilingual Education
  GDR *               6.9 ( 5.2 -  9.1)    18.6 (15.2 - 23.0)
  Index *            31.5 (23.5 - 42.0)    55.3 (45.3 - 68.2)
English as 2nd Language
  GDR *              10.9 ( 8.8 - 13.6)    15.7 (12.6 - 19.8)
  Index              24.2 (19.6 - 30.1)    33.5 (26.8 - 42.3)
Extended Day Care
  GDR *               6.7 ( 5.1 -  8.9)    11.5 ( 8.8 - 15.2)
  Index *            19.7 (14.7 - 26.4)    31.9 (24.5 - 42.2)

* Statistically significant difference between mail-mail and telephone-telephone.

We observed lower response variance in both numerical data (for example, head counts of students enrolled) and non-numerical data. Royce [3] details results for all School survey questions reinterviewed in 1991. We can think of four possible reasons for this result:

- Only respondents who answered the original survey by mail were eligible for the mail reinterview. These respondents were likely to be more cooperative and answer the questions more carefully in both interviews.
- Respondents interviewed by mail may take time to look up the answers to questions from records, or they may go through a more careful, but more lengthy, thought process to provide the needed facts. In contrast, those interviewed by telephone may feel the interviewer prefers a speedy response to an accurate one, so give their "best guess-timate." Research has shown some respondents employ what survey practitioners call "satisficing" [4]. In satisficing, the respondent expends just enough effort to satisfy the interviewer. Also, respondents interviewed by telephone may not feel free to take the time to look up records while the interviewer is waiting on the phone [5].
- Mail respondents may leave more difficult or uncertain questions blank. The Census Bureau's interviewers work very hard to get responses to all questions. An interviewer may manage to obtain an answer to a difficult question, but an unreliable answer. Mail respondents, on the other hand, may simply leave that question blank. We have found higher item non-response among the mail returns than in the telephone follow-up cases.
- Mail respondents may photocopy the original questionnaire after completing it and refer to their original answers when completing the mail reinterview.

We think some combination of the first three explanations is the most reasonable. Mail respondents, by definition, are more cooperative and motivated than those we must follow up by telephone. And mail interviewing probably promotes more careful responses and more use of records.

We eliminated the last possibility: mail respondents using photocopies of their original interviews can account for only a small part of the mail-mail versus telephone-telephone differences. We hypothesized that respondents using photocopies would give consistent answers to all questions in the reinterview. We discarded all cases where the first 11 of the 21 reinterview questions matched. These cases accounted for only 6.5 percent of the reinterview sample and had only a negligible effect on the comparisons. We concluded that only a small fraction of the mail reinterview respondents might have used photocopies, and that these cases had little effect on the response variance differences between the two procedures.

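The photocopy screen described in section 2.3 reduces to a simple filter over paired response records. The following is a hedged sketch with an invented record layout and toy cases; the actual analysis recomputed the full set of response variance measures on the screened sample rather than a single rate.

```python
# Hedged sketch of the photocopy screen: flag any mail case whose first
# 11 of 21 reinterview answers all match the original interview, drop
# those cases, and recompute the gross difference rate on a later item.
# The record layout and toy cases are invented for illustration.

def first_k_match(case, k=11):
    """True if the first k (original, reinterview) answer pairs all agree."""
    return all(a == b for a, b in case["pairs"][:k])

def gdr(cases, item):
    """Gross difference rate (percent) for one item across a set of cases."""
    changed = sum(1 for c in cases if c["pairs"][item][0] != c["pairs"][item][1])
    return 100.0 * changed / len(cases)

# 21 answer pairs per case; 1 = "yes", 0 = "no".
suspect = {"pairs": [(1, 1)] * 21}                            # perfectly consistent
typical = {"pairs": [(1, 1)] * 9 + [(1, 0)] + [(1, 1)] * 11}  # differs on item 9

cases = [suspect, typical, typical]
kept = [c for c in cases if not first_k_match(c)]             # suspect screened out
share_flagged = 100.0 * (len(cases) - len(kept)) / len(cases)
```

In the 1991 analysis the flagged cases were only 6.5 percent of the reinterview sample and barely moved the comparisons; in this toy example the one perfectly consistent case is dropped before rates are recomputed.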
These findings on the quality of mail response data have implications beyond the SASS. Perhaps mail surveys can provide as good or better data than some surveys now conducted by telephone or in person, and at lower cost. For the SASS, we need to determine whether the more consistent data achieved through mail results from the type of respondent who answers by mail, and whether increased item non-response will cancel the gains of improved consistency.

3. PLANS FOR THE FUTURE

Reinterview programs can be a valuable diagnostic tool to identify questions which need improvement, or which perhaps should be dropped. The NCES and the Census Bureau are committed to producing accurate and reliable SASS data. They have heeded the reinterview's diagnosis and have acted to make improvements with some success.

What about the future? Both agencies are firmly committed to developing a first-class survey. The 1992 Teacher Follow-up Survey (TFS), which surveyed a subsample of 1991 SASS teachers, used a probing, reconciled reinterview to learn the reasons for inconsistent responses. We hope not only to identify the less reliable questions, but to gather information about why inconsistencies occur.

Plans for the future include:
- Focus at least some cognitive research on the reinterview findings.
- Consider using reconciled, probing reinterviews in the SASS to learn more about why inconsistencies occur.
- Consider expanding the mail reinterview to the Teacher and Administrator surveys.
- Apply quality assurance methods to data collection.
- Reinterview small, non-random samples to solve specific data quality problems, for example unacceptably high pre-edit rejects.
- Use reinterview methods to evaluate coverage in teacher listings (the frame of the SASS teacher sample).
- Maintain a strong commitment to a continual cycle of evaluation and improvement of SASS questionnaires, methods, and procedures.

REFERENCES

[1] Forsman, G. and Schreiner, I., "The Design and Analysis of Reinterview: An Overview," in Biemer, P. P., et al., editors, Measurement Errors in Surveys, 1991, John Wiley & Sons, Inc., pp. 280-281.
[2] Newbrough, J., "Report of the SASS-213(R) and SASS-4(R) Reinterview," June 28, 1989, internal Census Bureau report.
[3] Royce, D., "1991 Schools and Staffing Survey (SASS) Reinterview Response Variance Report," July 1992, internal Census Bureau report.
[4] Krosnick, J. A., "The Impact of Satisficing on Survey Data Quality," Proceedings of the 1990 Annual Research Conference, U.S. Bureau of the Census, March 1990, pp. 835-845.
[5] Dillman, D. and Tarnai, J., "Mode Effects of Cognitively Designed Recall Questions: A Comparison of Answers to Telephone and Mail Surveys," in Biemer, P. P., et al., editors, Measurement Errors in Surveys, 1991, John Wiley & Sons, Inc., p. 76.
[6] Johnson, F., "Assigning Type of Locale Codes to the 1987-88 CCD Public School Universe," National Center for Education Statistics Technical Report, July 1989.
[7] Parmer, R. J., Shen, P., and Tan, A. I., "Mail Versus Telephone Response in the 1991 Schools and Staffing Surveys," Joint Statistical Meetings, Boston, 1992.

This paper reports the general results of research undertaken by Census Bureau staff. The views expressed are attributable to the authors and do not necessarily reflect those of the Census Bureau.
