PERFORMANCE ASSESSMENT OF STANDARDS AND SKILLS (PASS)
What is Performance Assessment of Standards and Skills (PASS)?
PASS is a standards-based assessment tool developed by GRACE which is designed to measure the proficiency level of students in the competencies and standards set by the Department of Education (DepEd).
What type of test is PASS?
PASS is a summative, criterion-referenced test that assesses student performance against a set of predetermined learning standards based on the DepEd K-12 curriculum guides. It is administered in paper-and-pencil format with structured (multiple-choice) and unstructured (constructed-response and essay) questions.
What are the unique features of PASS?
PASS evaluates student growth and the extent to which students have learned a specific set of knowledge and skills, with concise written descriptions of what students are expected to know and be able to do at a specific grade level. It also identifies specific strengths and weaknesses of students in line with their performance in the various content areas of the assessment.
The PASS Professional Development (PD) trainings are conducted to strategically address the needs and concerns of client schools by providing relevant and evidence-based workshops geared towards knowledge-sharing and capacity-building. They also aim to enhance the capabilities of teachers in providing effective teaching and strategic classroom instruction.
What are the subjects and grade levels covered in PASS?
PASS is offered to Grades 1-10 students and it covers the five (5) major subject areas of the K-12 curriculum namely: English, Math, Science, Filipino, and Araling Panlipunan. It is the prerogative of the client school to choose which subject/s and grade level/s they would like their students to be tested on.
How were the items/questions for PASS formulated?
To ensure the quality of its assessment tools, GRACE follows standard and stringent processes of test development. Anchored on a conceptual framework using the Cognitive Process Dimensions of Anderson and Krathwohl, a team of Subject Matter Experts (SMEs) from various fields and disciplines was commissioned to develop PASS items based on the Table of Specifications (TOS) formulated by GRACE. The initial items were then subjected to an external item review by another panel of SMEs provided by Phoenix Publishing House. After a rigorous process of content validation, the items were pilot tested with selected private schools across the country. Data collated from the pilot test was used to conduct item analysis and establish the psychometric properties of PASS in order to develop the final set of items.
Are the tests valid and reliable?
PASS tests were tested for internal consistency reliability and for construct validity. Cronbach's alpha values on the various scales of the PASS tests range from 0.5 to 0.8. The results of correlational studies conducted by GRACE indicate that PASS tests are scientifically useful indicators of proficiency in achievement tasks appropriate for their grade levels, with clear evidence of construct validity. In addition, convergent validity was established in studies correlating PASS scores with teacher-assessment scores, and contrast group studies on the PASS tests show significant differences in the performance of low and high groups at alpha = .05.
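As a rough illustration of the internal-consistency statistic cited above, Cronbach's alpha can be computed from an item-response matrix as follows. The responses in this sketch are hypothetical 0/1 (wrong/right) sample data, not actual PASS pilot-test data:

```python
def cronbach_alpha(scores):
    """scores: list of examinee rows, each a list of per-item scores."""
    k = len(scores[0])  # number of items

    def variance(values):
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / (len(values) - 1)

    # Variance of each item's scores across examinees
    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    # Variance of the examinees' total scores
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical responses of five examinees to four items
responses = [
    [1, 1, 1, 1],
    [1, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
]
print(round(cronbach_alpha(responses), 2))  # → 0.69
```

A value like 0.69 falls within the 0.5 to 0.8 range reported for the PASS scales.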
How did GRACE come up with the proficiency levels for PASS?
Despite the release of a more recent grading system per DO 8, s. 2015, GRACE still uses the proficiency levels in DO 73, s. 2012. The grading system stated in the earlier order is more consistent with standards-based assessment, and its proficiency-level descriptions more accurately reflect the understanding and mastery of knowledge and skills that students are expected to learn. The proficiency level scales of PASS are as follows, along with the DepEd grading scale equivalent:
What are the procedures for availing of the PASS assessment services of GRACE?
Once the client school decides to avail of the assessment service, GRACE immediately coordinates with the school administrator/principal through an agent from Phoenix Publishing House (PPH) to formally sign a Memorandum of Agreement (MOA) effective for one (1) year. GRACE will then contact a representative from the client school for details on the test administration.
Who will administer the test?
In practice, the client school has two (2) options for test administration. The school may opt to request their teachers to serve as proctors. In that case, GRACE will set a 1-hour briefing for teachers, scheduled two (2) days before the testing date, to orient them regarding the PASS test administration policies and procedures. A proctor's fee will be provided for the assigned teachers, and the distribution of payment will be coordinated with the school administrator/principal. Teachers from other grade levels, or those teaching subjects other than those covered in the PASS test, will serve as proctors.
The school may also request for GRACE to provide the test proctoring services. In this case, GRACE will provide trained examiners and proctors to administer the test to the students of the school.
What is the difference between the pre-test and post-test? Why is it important to take both?
The PASS pre-test is typically administered at the beginning of the school year (June) to measure the pre-existing knowledge of students, whereas the post-test is conducted at the end of the school year (March) to compare the knowledge attained after instructional interventions done by the teacher.
Having both the pre-test and the post-test results in a concise and effective evaluation of student progression and learning throughout the school year, reporting comprehensive baseline information on students' prior knowledge and their progression after instruction or further intervention. Teachers can therefore monitor student progress and difficulties, which can guide them in formulating informed and relevant instruction inside the classroom throughout the academic year.
What reports will the school receive after the test administration?
After the test administration, GRACE will provide the client school with comprehensive PASS test reports with descriptions of the grade level standards and domains taken verbatim from the DepEd K-12 curriculum guides. The reports are divided into three categories:
How are the reports made?
The PASS test uses Optical Mark Recognition (OMR) answer sheets for scanning and computation of raw scores based on shaded responses. Blank responses are therefore omitted and automatically scored as zero. The scanner then produces a soft copy of the datasets, which are scored and processed into reports by licensed Psychometricians (RPms) using the PASSRGS program. A Quality Assurance (QA) team then reviews the generated reports before they are printed and delivered to the schools.
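The raw-scoring rule described above (blank responses omitted and scored as zero) can be sketched as follows. The answer key and the sample responses are hypothetical; this illustrates only the scoring rule, not GRACE's actual scanning software:

```python
# Hypothetical 5-item answer key for illustration
ANSWER_KEY = ["A", "C", "B", "D", "A"]

def raw_score(responses, key=ANSWER_KEY):
    """responses: list of shaded letters; None marks a blank item."""
    score = 0
    for marked, correct in zip(responses, key):
        if marked is None:      # blank response: omitted, counts as zero
            continue
        if marked == correct:   # shaded and correct: one raw-score point
            score += 1
    return score

# One wrong answer (item 3) and one blank (item 4) leave three points
print(raw_score(["A", "C", "D", None, "A"]))  # → 3
```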
How are essay responses scored?
Essay responses are manually scored by Subject Matter Experts (SMEs) using holistic scoring rubrics on selected grade levels. The scores are then encoded in the datasets and endorsed to the Psychometricians (RPms) for processing.
When will the reports be delivered?
The reports will be delivered fourteen (14) working days after the test administration for Metro Manila schools and twenty-one (21) working days for provincial schools.
What are the protocols for absentee testing?
During test administration, GRACE consolidates the names of students who were unable to take the test and endorses them to the school principal/administrator for absentee testing. It is the prerogative of the school to allow another examination date for the absentees; however, data from the absentee test results would no longer be included in the class and school reports. In addition, their Individual Profile Report (IPR) will be delivered within the appointed working days after the said date.
Note: The school must be in close coordination with GRACE in finalizing the testing date and logistics for the absentees. GRACE will then send testing personnel to administer the test. For regional clients, area coordinators or PPH field personnel are appointed for test administration.
INDIVIDUAL’S PERFORMANCE IN LITERACY AND NUMERACY (IPLAN)
What is the Individual’s Performance in Literacy and Numeracy (IPLAN)?
IPLAN is a diagnostic and formative tool designed to assess the foundational skills of students in literacy and numeracy based on the Department of Education K-12 curriculum. The assessment is criterion-referenced; it therefore evaluates performance against a set of predetermined learning standards.
What are the unique features of IPLAN?
IPLAN is performance-based and administered one-on-one to students using an interview style; its mode of assessment includes the use of reading materials and manipulatives. Testing conditions and behaviors of the students during testing are also reported as such factors may affect their performance during the conduct of the assessment (Aylward & Carson, 2005).
What is the value of IPLAN?
IPLAN indicates the skill strengths across the content areas and sub-areas of literacy and numeracy. It also shows whether or not a student is at risk in his/her foundational skills and pinpoints specific areas of difficulties that need further interventions.
What are the subjects and grade levels covered in the IPLAN?
IPLAN measures the Literacy skills of students in major content areas of oral language, phonological awareness, phonics and word recognition, grammar, vocabulary development, reading fluency and comprehension as well as writing/composition. For numeracy, it measures skills in number and number sense, geometry, patterns and algebra, measurement as well as statistics and probability.
The assessment is offered to Kinder and Grades 1, 2, and 3 students. It also consists of a pre-test at the beginning of the school year (June) to measure students' pre-existing knowledge, and a post-test at the end (March) to compare the knowledge attained after instructional interventions done by the teacher.
Why assess literacy and numeracy?
Literacy and numeracy are fundamental and foundational skills that constitute key information-processing abilities, forming a foundation for the development of higher-order cognitive skills and a prerequisite for attaining and understanding specific domains of knowledge.
High literacy and numeracy skills are also associated with personal, social and economic wellbeing and are essential in a broad range of contexts in everyday life. These skills are also increasingly significant in the labor market and can hugely affect employability, as these skills underpin a much wider set of work-related competences. (EU Skills Panorama, 2014)
Why assess early?
Literacy and numeracy skills are crucial as they are used in many aspects of our lives, are often used in conjunction with other skills, and are used to begin, establish, and maintain lifelong learning (Getting the Basics Right: Quality Primary Education in the North Pacific, 2015). Early education is also essential for assessment as critical skills and knowledge learned throughout the years can be a determining factor for future performances (Langham, 2009). These skills must therefore be acquired early on in order to achieve success in life.
Without early diagnosis and intervention in problem areas, a lack of these skills may lead to students falling farther behind in all future learning as they progress through school.
How were the items/questions formulated? Are the tests valid and reliable?
GRACE followed a standard and stringent process during the test development of IPLAN. Anchored on a conceptual framework, a team of Subject Matter Experts (SMEs) provided by Phoenix Publishing House developed items based on the Table of Specifications (TOS) formulated by GRACE. The initial items were then subjected to an external review by another panel of SMEs. After a rigorous process of content validation, the items were pilot tested in selected schools across the country. The pilot test data were used to conduct item analysis, establish IPLAN's psychometric properties, and formulate the final form of IPLAN.
How did GRACE come up with the proficiency levels for IPLAN?
The proficiency levels used during the development of the IPLAN assessment are patterned after the proficiency levels stated in DepEd Order No. 73, s. 2012 and in the Early Grade Reading Assessment (EGRA) Toolkit, an assessment of basic foundational literacy skills in the early grades handled by Development Strategists International Consulting Inc. (DSIC). The EGRA project was also commissioned by the Educational Section of the Asian Development Bank (ADB). Listed below are the proficiency levels used for IPLAN:
What are the procedures for availing of the assessment services for IPLAN?
Once the client school decides to avail of the assessment service, GRACE immediately coordinates with the school administrator/principal through an agent from Phoenix Publishing House (PPH) to formally sign a Memorandum of Agreement (MOA) effective for one (1) year. GRACE will then contact a representative from the client school for details on the test administration.
How is IPLAN administered?
IPLAN is typically non-written and takes approximately 15 to 20 minutes per student. It is administered one-on-one, with a particular pupil-assessor seating arrangement conducive to test administration. For Literacy, the student is seated beside the assessor for proper communication and execution of reading and listening tasks. For Numeracy, on the other hand, the student is seated across from the assessor, as counters and manipulatives are used throughout the assessment.
Establishing rapport with the student is an important aspect of test administration as informal behavioral observations are made throughout the entire assessment process via the Assessor’s Informal Observation Questionnaire. A scoring sheet for Literacy/Numeracy per student is also provided for the assessor to score the appropriate response/s made by the student based on the analytic rubric per item.
Who will administer the test?
GRACE will be providing assessors and site coordinators to administer the test to the school. Subject Matter Experts (SMEs) and licensed professionals who underwent extensive screening and training for the IPLAN test administration will assess the students, whereas site coordinators are in charge of monitoring the overall progress of the assessment and are responsible for accompanying the students to the testing room and back to their classrooms at the end of every test administration. Site coordinators must also time the assessment accurately and ensure that it stays on schedule.
How many students can be assessed by the assessor?
In administering IPLAN, an assessor can handle fourteen (14) students per day. Thus, the number of assessors will depend on the number of students that will undergo the IPLAN assessment.
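Given the stated capacity of fourteen (14) students per assessor per day, the number of assessors needed for a single testing day works out to a simple ceiling division. A minimal sketch (the student counts are hypothetical):

```python
import math

def assessors_needed(num_students, per_assessor=14):
    """Assessors required for one day at 14 students per assessor."""
    return math.ceil(num_students / per_assessor)

# e.g. a hypothetical cohort of 100 students needs 8 assessors,
# since 7 assessors could cover only 98 students in a day
print(assessors_needed(100))  # → 8
```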
How are tasks administered on the following content areas of literacy: Oral Language, Reading and Comprehension, and Writing/Composition?
IPLAN requires each student to perform specific tasks as instructed by the assessor throughout the entire test administration. In Oral Language, the assessor requires the student to interact and talk about his/her personal experiences and/or text listened to or read.
In Reading and Comprehension, the assessor lets the student read aloud the grade level texts within the reading materials provided. Lastly, in Writing/Composition, the assessor instructs the student to express his/her ideas effectively in formal and informal compositions.
How are responses on these tasks scored?
Constructed responses made by students on each task are scored by the appointed assessor using holistic rubrics developed by GRACE Subject Matter Experts (SMEs).
How are the reports made?
After test administration, the scoring sheet/s and the Assessor's Informal Observation Questionnaire are encoded into datasets, which are scored and processed into reports by licensed Psychometricians (RPms) using a scoring program developed by GRACE. A Quality Assurance (QA) team then reviews the generated reports before they are printed and delivered to the schools. A compact disc (CD) with a soft copy of the results will also be included in the reports.
What reports will the school receive after the test administration? When are they going to receive the reports?
After the test administration, GRACE will be providing a comprehensive individual, class, and school report for IPLAN Literacy/Numeracy which consists of the following results:
When will the reports be delivered?
The reports will be delivered fourteen (14) working days after the test administration for Metro Manila schools and twenty-one (21) working days for provincial schools.
What are the protocols for absentee testing?
During test administration, GRACE consolidates the names of students who were unable to take the test and endorses them to the school principal/administrator for absentee testing. It is the prerogative of the school to allow another examination date for the absentees; however, data from the absentee test results would no longer be included in the class and school reports. In addition, their Individual Profile Report (IPR) will be delivered within the appointed working days after the said date.
Note: The school must be in close coordination with GRACE in finalizing the testing date and logistics for the absentees. GRACE will then send testing personnel to administer the test. For regional clients, area coordinators or PPH field personnel are appointed for test administration.
READINESS ASSESSMENT FOR SENIOR HIGH (RASH)
What is Readiness Assessment for Senior High (RASH)?
RASH is an aptitude assessment tool that provides predictive information regarding the potential for success of Grade 10 students in the tracks and strands of the Department of Education (DepEd) Senior High School (SHS) Program.
What are the unique features of RASH?
RASH enables students to explore possible career paths along the tracks and strands of the SHS curriculum based on their core competencies and occupational interests. It also identifies academic proficiency within the five (5) applied track subject areas of the SHS curriculum, highlighting specific strengths and weaknesses for each domain. The results can also be used to assist schools in enhancing their SHS curricula and formulating career planning interventions for the educational and vocational guidance of their students.
Although both assess students to identify their potential track/strand in the SHS Program, there are many differences between the National Career Assessment Examination (NCAE) and RASH, as summarized below:
What tests will students take for RASH? Is it necessary to take them all?
The framework of RASH consists of three (3) tests that assess the student competencies and skills needed for the SHS program. The Assessment of Achievement Potential is an achievement test that measures the standards and competencies of students in the five (5) applied track subjects of the K-12 curriculum, which students must take regardless of their chosen specialization. The Measurement of Core Competencies aims to determine their current skill levels with regard to the specialized tracks and strands of the SHS program. It also includes a 10-item artistic inclination test that measures their creativity through visual (drawing) and written responses.
Lastly, the Occupational Interest Profile identifies their career preference in pursuing a particular track/strand, given their personal characteristics. Accomplishing the entire assessment therefore is mandatory.
How were the items/questions for RASH formulated? Are the tests valid and reliable?
For Achievement Potential, Focus Group Discussions (FGDs) with Subject Matter Experts (SMEs) from various fields and disciplines were held to identify and determine the power standards to be assessed in the SHS curriculum. Core Competencies were identified based on a comprehensive review of literature and the SHS curriculum. The framework for the Occupational Interest Profile was derived from John Holland’s Occupational Themes as reflected in Cattell’s 16 Personality Factors Test (16-PF). Items developed for each component were subjected to a series of extensive reviews by panels of SMEs for content validation and psychometric item analysis to ensure item quality and functionality.
How did GRACE come up with the results of RASH?
Results from the Measurement of Core Competencies and the Occupational Interest Inventory are used to plot students' potential tracks/strands, with each strand tagged to a set of needed skills and suggested occupational fields that they could pursue. For instance, a student could fall under the ABM strand if he/she attained high scores in clerical ability, verbal reasoning, numerical reasoning, and entrepreneurial thinking, while high occupational interest scores under the enterprising theme would indicate his/her strength (interest) in pursuing the ABM strand. An overall Assessment of Achievement Potential score is also provided for students to determine their readiness for the five (5) applied track subject areas of the SHS curriculum by measuring their standards and competencies for each subject. Standards-based assessment was used in order to measure the performance of students relative to their proficiency in the competencies (knowledge, skills, and abilities) across subject areas set in the K-12 standards.
Do the results indicate where the student will be most appropriately placed?
The results of each test are interconnected; scores are statistically calibrated to produce the recommended tracks/strands based on the three (3) highest categories of core competencies and occupational interest obtained. The achievement test highlights the proficiency level of students per applied track subject area of the curriculum. By using information that is specifically geared towards their achievement, competencies, and career interest based on personality, students can be guided in deciding what track/strand is ideal for them.
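The selection of the three (3) highest-scoring competency categories described above can be sketched as follows. The category names, scores, and competency-to-strand mapping here are purely hypothetical illustrations, not GRACE's actual statistical calibration:

```python
# Hypothetical mapping from competency categories to SHS strands
COMPETENCY_TO_STRAND = {
    "clerical ability": "ABM",
    "verbal reasoning": "HUMSS",
    "numerical reasoning": "STEM",
    "entrepreneurial thinking": "ABM",
    "mechanical reasoning": "TVL",
}

def recommend(scores):
    """scores: dict of competency -> score; returns top-3 and strands."""
    top3 = sorted(scores, key=scores.get, reverse=True)[:3]
    # Keep the order of the top categories while dropping duplicate strands
    strands = []
    for competency in top3:
        strand = COMPETENCY_TO_STRAND[competency]
        if strand not in strands:
            strands.append(strand)
    return top3, strands

top, strands = recommend({
    "clerical ability": 82,
    "verbal reasoning": 65,
    "numerical reasoning": 74,
    "entrepreneurial thinking": 90,
    "mechanical reasoning": 58,
})
print(top)      # three highest-scoring competency categories
print(strands)  # strands suggested by those categories
```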
What if the preferred SHS track or strand of a student does not appear in any of the RASH results?
Upon entering Grade 11, students are given the opportunity to choose only one specialization (track/strand) which will not only define what subjects they are about to take but will also serve as a foundation for their future career. The result from RASH provides them with a comprehensive framework of what they need to learn (among the track subjects), what skills they currently have, and what kind of work environment (occupation) suits their personality. It also provides information that enables students to make better career decisions that would greatly impact their potential for occupational success and satisfaction in the future.
It is important to note however that there are also other factors to consider in choosing a particular track/strand such as school capacity, availability of tracks and strands in their area, and in-demand jobs in their district.
When is the testing period for RASH?
RASH is conducted once during the mid-school year (October to November) to coincide with the testing season for NCAE.
Who will administer the test?
GRACE will provide the proctoring services, with trained examiners and proctors administering the test to the students of the school.
How long does it take to finish the test? Is it possible to adjust the time allotted?
It is highly suggested that the test administration of RASH be divided into two (2) working days to avoid test fatigue, since it usually takes five (5) to six (6) hours without breaks. However, the school must strictly comply with the instructions provided by GRACE, including the sequence and time allotted for each test, to ensure test standardization.
What reports will the school receive after the test administration?
Each student will be provided with a Senior High Potential Profile to assist them in determining the most suitable track/strand to pursue. Each profile report consists of the following:
How are the reports made?
RASH uses Optical Mark Recognition (OMR) answer sheets for scanning and computation of raw scores; blank items are therefore automatically marked as "missing" responses. The scanner then produces a soft copy of the datasets, which are scored and processed into reports by licensed Psychometricians (RPms) using a scoring program developed by GRACE. A Quality Assurance (QA) team then reviews the generated reports before they are printed and delivered to the schools.
How is the Artistic Inclination Test scored?
Responses on the artistic inclination test are manually scored by trained Subject Matter Experts (SMEs) using holistic scoring rubrics. Total scores per domain are then encoded in the datasets and endorsed to the Psychometricians (RPms).
When will the RASH reports be delivered?
The reports will be delivered twenty-one (21) working days after the testing date for both Metro Manila and provincial schools.
What are the protocols for absentee testing?
During test administration, GRACE consolidates the names of students who were unable to take the test and endorses them to the school principal/administrator for absentee testing. It is the prerogative of the school to allow another examination date for the absentees; however, data from the absentee test results would no longer be included in the class and school reports. In addition, the Senior High Potential Profile will be delivered twenty-one (21) working days after the said date.
Note: The school must be in close coordination with GRACE in finalizing the testing date and logistics for the absentees. GRACE will then send testing personnel to administer the test. For regional clients, area coordinators or PPH field personnel are appointed for test administration.
Assessment of Readiness for College and Career (ARC)
What is Assessment of Readiness for College and Career (ARC)?
ARC is a standards-based aptitude assessment tool geared towards measuring critical thinking and the necessary knowledge and skills for pursuing higher education and/or a specific career. The assessment tool aims to measure the academic readiness of Grades 11-12 students based on the College Readiness Standards (CRS) of the Commission on Higher Education (CHED), occupational skills, and interest among occupational fields from Holland’s Vocational Themes.
What are the unique features of ARC?
ARC is composed of an achievement test, an aptitude test, and an interest inventory that aim to assess the knowledge, skills, abilities, and personal preferences of students in relation to different occupational fields. It specifically assesses student knowledge in the core subject areas indicated by the CRS and provides information on their preparedness for the academic qualifications of higher education and/or the skills needed to pursue a certain career path. The aptitude test and interest inventory results are also interlinked to provide a comprehensive individual profile report to the students.
ARC focuses on assisting schools and guardians in college and career exploration and planning through the assessment results, the Self-Help: Know and Explore Sheet, and result and report interpretations. Furthermore, GRACE provides result/data interpretation services in collaboration with the school to better serve ARC's purpose: to assist students as much as possible.
What are ARC's subtests?
ARC consists of three (3) subtests: achievement test, aptitude test, and interest inventory. The achievement test assesses the proficiency of students within the seven (7) entry-level general education courses: English, Mathematics, Science, Filipino, Social Studies, Humanities, and Literature. The aptitude test and the interest inventory assess student competency and preferences among the different sets of skills and activities related to Holland’s Vocational Themes or occupational fields: Realistic, Investigative, Artistic, Social, Enterprising, and Conventional.
How was ARC developed? What is its content based on?
ARC was conceptualized through a framework established by GRACE, along with technical advisers who are well-known in their respective fields. The content of all the subtests was developed through a series of Focus Group Discussions (FGDs) and item-writing workshops with Subject Matter Experts (SMEs) from various fields and disciplines. Items were also identified through the curriculum standards of CHED and Holland's Vocational Themes. Initial items were then subjected to a series of extensive reviews by panels of SMEs for content validation, and a pilot test was conducted to establish reliability and validity. Psychometric analysis of the assessment was then performed to ensure its quality and functionality.
What information could ARC results provide to its stakeholders?
ARC results provide students with information regarding their strengths among the subject area domains and pinpoint specific competencies that need critical attention and further development in order for students to be academically prepared for their preferred course. The results also inform students of how their level of interest in specific occupational fields aligns with their measured competencies in those fields.
A detailed description of occupational fields and information regarding the occupations under each field are also included in the results. Guidance counselors and parents can also use this information in helping students undertake college and career exploration and planning.
The school, on the other hand, is provided with information regarding the performance of each class on the achievement and aptitude tests, as well as the frequency of students according to their preferred occupational field and course or program.
What are the specific components of the ARC report?
The result of ARC consists of an Individual Profile component which could be used by everyone involved in the process of the students’ career planning. The School and Class report component aims to assist schools in ensuring that they address not only individual student guidance and learning needs, but also the learning needs of the class and school as a whole. The specifics of the reports are provided below:
What are the differences between ARC and the National Career Assessment Examination (NCAE)?
ARC and NCAE share the same goal: to give students an idea of their possible choices for career exploration and planning in relation to their interests and/or aptitude. However, ARC aims to provide more detailed and comprehensive results and information to students, their parents/guardians, and the school than NCAE does, in order to better help and encourage collaboration among these stakeholders. The main difference between ARC and NCAE therefore lies in the information and services each aims to provide, as illustrated below:
When is the testing period for ARC?
ARC is recommended to be administered once during the mid-school year (October to November).
How long does it take to finish the test? Is it possible to adjust the time allotment for each subtest?
ARC is administered over eight (8) hours, which already includes a one (1) hour lunch break, fifteen (15) minute morning and afternoon breaks, and a total of twenty-five (25) minutes for giving instructions and distributing test materials within the classrooms. The provided time allotment is ideal for administering ARC; it is therefore highly advised not to exceed these time allotments or make unnecessary adjustments.
The school may opt to have the test administration of ARC divided into two (2) working days to avoid test fatigue. However, strict compliance with the instructions provided by GRACE is required, including the sequence and time allotted for each test, to ensure test standardization.
How are the reports made?
ARC uses Optical Mark Recognition (OMR) answer sheets for scanning and computation of raw scores based on shaded responses. Blank responses are therefore omitted and automatically scored as zero. The scanner then produces a soft copy of the datasets, which are scored and processed into reports by licensed Psychometricians (RPms) using a scoring program developed by GRACE. A Quality Assurance (QA) team then reviews the generated reports before they are printed and delivered to the schools.
How are essay responses scored?
Essay responses are manually scored by Subject Matter Experts (SMEs) using holistic scoring rubrics on selected grade levels. The scores are then encoded in the datasets and endorsed to the Psychometricians (RPms) for processing.
When will the ARC reports be delivered?
The reports will be delivered twenty-one (21) working days after the testing date for both Metro Manila and provincial schools.
What are the protocols for absentee testing?
During test administration, GRACE consolidates the names of students who were unable to take the test and endorses them to the school principal/administrator for absentee testing. Students are considered absent if:
- they were not able to take all three subtests (Achievement, Aptitude, and Interest Inventory)
- they missed one of the subtests and/or specific parts of a subtest (e.g. a subject in the Achievement test, one of the Parts of the Aptitude test).
It is the prerogative of the school to allow another examination date for the absentees; however, data from the absentee test results would no longer be included in the class and school reports. In addition, their individual reports will be delivered within the appointed working days after the said date.
Note: The school must be in close coordination with GRACE in finalizing the testing date and logistics for the absentees. GRACE will then send testing personnel to administer the test. For regional clients, area coordinators or PPH field personnel are appointed for test administration.