SDP FELLOWSHIP CAPSTONE REPORT

How Does Data Management Impact Practitioner Decision Making? Illustrations from Three Case Studies

Maura Bonanni, Achievement First
Kate Drake Czehut, ROADS Charter High Schools
Brad Gunton, New Visions for Public Schools
Cohort 4 Fellows

SDP Fellowship Capstone Reports

SDP Fellows compose capstone reports to reflect the work that they led in their education agencies during the two-year program. The reports demonstrate both the impact fellows make and the role of SDP in supporting their growth as data strategists. Additionally, they provide recommendations to their host agencies and serve as guides to other agencies, future fellows, and researchers seeking to do similar work. The views or opinions expressed in this report are those of the authors and do not necessarily reflect the views or position of SDP or the Center for Education Policy Research at Harvard University.

[…]

Figure 1: The Data Management Life Cycle

The sheer quantity of potentially useful information generated by members of a school community over the course of a day defies human capacity for data entry. Therefore, school or district leaders must decide which pieces of raw data to collect. The second tab, "Collect Data," draws attention to this task. According to scholarship on social innovation (Skillern et al. 2007), this decision-making process should be guided by the organization's theory of action or theory of change. For example, if a district's focus is to increase student learning by fostering strong social ties, then the district would aim to standardize the collection of data on relationships and indicators of student learning. Data collection at all schools is also guided in part by federal, state, and local mandates. Once the decision of which data to collect is made, schools and districts need to develop internal policies and practices to ensure that all of the data are collected in a standardized and timely manner. In the first case study, Kate Czehut shares how a startup charter school network in New York City developed and implemented an internal process of data collection.

Data collection is one step in the first stage of data management. Once data are collected, they need to be cleaned and processed before they can be used as the basis of analysis. Tabs three and four of the diagram represent this cleaning and processing work. This first stage of data management is essential because analyses conducted on incomplete datasets can lead to biased and inconsistent estimates, thus providing a foundation for decision-making that is more quicksand than cornerstone.

The second stage of data management is represented by the right and left sides of the diagram. The right side, which starts with the "Build Descriptive Tool" tab, draws attention to the work done to create or customize dashboards, gradebooks, and attendance records, and to the time and attention that stakeholders spend with the resulting descriptive tools in order to inform their decision-making. The second case study, by Maura Bonanni, explains how Achievement First improved its method of producing and using descriptive tools in decisions to promote or retain students.

The left side of the diagram, beginning with the "Explore & Analyze" tab, reflects situations in which the desired outcome is less a visualization of the data and more a rigorous analysis of it. For example, a superintendent might be weighing the pros and cons of a remediation program and want to know how students who have gone through the program have fared. These tabs acknowledge that quantitative methods more sophisticated than descriptive statistics are needed to address this kind of counterfactual question and estimate program impact.
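To make the contrast with purely descriptive reporting concrete, the short Python sketch below (hypothetical records and field names) compares average pre/post test-score gains for program participants and non-participants. It is only a naive difference in means, not the more rigorous methods such a counterfactual question ultimately requires, but it shows why any impact claim needs a comparison group at all.

```python
# Rough sketch: comparing test-score gains for remediation-program participants
# vs. non-participants. Records and field names are hypothetical; a real analysis
# would use more rigorous methods before making causal claims.

students = [
    {"id": "S1", "in_program": True,  "pre_score": 61, "post_score": 70},
    {"id": "S2", "in_program": True,  "pre_score": 55, "post_score": 63},
    {"id": "S3", "in_program": False, "pre_score": 58, "post_score": 62},
    {"id": "S4", "in_program": False, "pre_score": 64, "post_score": 66},
]

def mean_gain(group):
    """Average post-minus-pre gain for a list of student records."""
    gains = [s["post_score"] - s["pre_score"] for s in group]
    return sum(gains) / len(gains) if gains else float("nan")

participants = [s for s in students if s["in_program"]]
comparison = [s for s in students if not s["in_program"]]

# Without the comparison group, the participant gain alone says little about impact.
print("Participant gain:     ", mean_gain(participants))
print("Comparison-group gain:", mean_gain(comparison))
print("Naive impact estimate:", mean_gain(participants) - mean_gain(comparison))
```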
Lastly, the diagram highlights the connection between stakeholder decisions and the day-to-day activities in schools. The third case study, by Brad Gunton, illustrates this stage by showing how data management at the district level can positively shape the paths that individual students take through high school.

II. Case Studies

A. Collecting Data at ROADS

1. Agency Profile

In December 2011, ROADS Charter High Schools existed solely as an audacious plan to launch a network of schools focused on reinventing options for students who had been, as ROADS' visionary Cami Anderson likes to say, "derailed by the system." (ROADS formally stands for Reinventing Options for Adolescents Deserving Success.) ROADS' State University of New York-authorized charters called for the schools to enroll any young person age 15-17 who had earned fewer than one-quarter of the credits needed to graduate from high school, regardless of where he or she was in the educational system; to give preference to those who were court-involved, homeless, behind grade cohort, dropouts, or in Child Protective Services; and to provide all with opportunities to grow academically, professionally, and personally. The first enrollment application became available in February 2012, and by June ROADS had received 1,400-plus applications for 300 available seats. Two months later, two ROADS transfer high schools opened their doors, each serving 150 youth in an NYC neighborhood with a high rate of child poverty. Now in its third year of operation, the ROADS network consists of roughly 450 students and 90 full-time employees.

2. Policy/Research Question

From ROADS' inception there has been a tremendous hunger and sense of urgency to dig into data to learn as much as possible about ROADS' student population in order to make evidence-based decisions on where to devote time and resources. There was just one problem: not all pieces of data were being collected for all students, and those that were being collected were not all stored in a way that provided a solid foundation for network-wide decisions. Given the importance of data collection for data analysis, the question became: How could ROADS create a process to ensure that important student data are collected and entered into the student information system in a standardized manner at the school level?

This case study focuses on the task of data collection in the data management process laid out in Part I. The act of collecting data is the crucial first step toward data-driven decision-making. Analyses that do not account for missing data can produce biased and inconsistent estimates. When that occurs, the foundation on which decisions are to be made is no more solid than a best guess.

There are three important parts to data collection. First, there is the decision of which data to collect. This determines the scope of the questions that the data can address.
For example, if program participation is not tracked, then little insight can be gained into the impact of program participation on individual outcomes. Next is the decision of whose data to collect. Too frequently, data are collected only for certain groups of students, which deprives stakeholders of the opportunity to make direct comparisons across student groups. When this occurs, the data might, for example, show that students in a particular intervention program scored, on average, five points higher on the post-program test than on the pre-program test. However, without the test scores of students who did not participate in that intervention, there would be no way to gauge its relative benefits for student learning. The final step is how to collect data. The desired outcome is data points that are standardized, transparent, and comprehensive. Regardless of the format in which data points are obtained, they all need to be coded for analysis. Processing and cleaning data is much more efficient if the data are standardized and have been coded in a clear, unambiguous, and pertinent way. In order for them to be

[…]

benefit various stakeholder interests. For example, we introduced the monthly data close to principals as the path towards realizing their goal of having real-time, useful data reports. For directors of operations, we explained how the monthly data close would eliminate the stress of completing year-end compliance reporting, because all of the data needed would already be in the system. In this way, we built staff investment in the process and attempted to limit compliance-driven behaviors.

Three other strategies also contributed to the success of implementing a monthly data close. First, we started small: the amount of data included in the initial monthly data close spreadsheet was limited to allow school staff to ease into the process. Second, we built time into the Research, Evaluation and Data team's schedule so that one team member could devote a day at each school during the week of the data close to answer questions in person. Lastly, we celebrated each school's success loudly and enthusiastically over email and through events.

4. Results/Impact

Figure 3 below illustrates the greatest outcome of this project: a standardized, transparent, and comprehensive database. In January 2014, when we launched the monthly data close, 66% of students at ROADS 2 and 52% of students at ROADS 1 were missing at least one pertinent piece of information. In three months, the monthly data close had largely eradicated the issue of missing data. By April, no student was missing the student information identified in the data close workbook. A complete and standardized database is the greatest outcome of this project.

Figure 3: Percent of ROADS Students Missing Pertinent Data, January–April 2014 (ROADS 1: 52%, 16%, 3%, 0%; ROADS 2: 66%, 52%, 8%, 0%)
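The audit behind a figure like this can be scripted directly against an export from the student information system. A minimal sketch, with a hypothetical set of required fields and in-memory records standing in for that export:

```python
# Minimal sketch of the missing-data audit summarized in Figure 3.
# Records and field names are hypothetical stand-ins for a student
# information system export; empty strings or None mark missing values.

REQUIRED_FIELDS = ["date_of_birth", "home_language", "iep_status", "entry_date"]

students = [
    {"school": "ROADS 1", "date_of_birth": "1998-03-02", "home_language": "Spanish",
     "iep_status": "Y", "entry_date": "2013-09-04"},
    {"school": "ROADS 1", "date_of_birth": "", "home_language": "English",
     "iep_status": "N", "entry_date": "2013-09-04"},
    {"school": "ROADS 2", "date_of_birth": "1997-11-19", "home_language": "",
     "iep_status": None, "entry_date": "2013-09-05"},
]

def pct_missing_any(records, required):
    """Percent of records missing at least one required field."""
    flagged = sum(
        1 for r in records
        if any(not r.get(field) for field in required)
    )
    return 100.0 * flagged / len(records) if records else 0.0

for school in sorted({s["school"] for s in students}):
    rows = [s for s in students if s["school"] == school]
    print(f"{school}: {pct_missing_any(rows, REQUIRED_FIELDS):.0f}% of students missing data")
```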
The monthly data close has also had two additional benefits. First, the project has begun to clarify roles and responsibilities around student data collection and entry at the school and network level. School-based operations staff now take more ownership of their students' data. Second, the project has built time and structure into data collection and data reporting at ROADS. Prior to the monthly data close, data collection and entry requests came on an ad hoc basis, usually with immediate deadlines. This caused frustration all around: stakeholders could not get the answers and evidence they wanted on a timely basis, and staff were harried to produce on unrealistic timelines. The monthly data close clarifies the timeline for collecting and entering data as well as the schedule for monthly reports. Now, reporting is done on a monthly basis after the data close.

Developing a process to ensure that student data are collected and entered into our databases in a standardized manner opens the door to using data to make evidence-based decisions. ROADS is confident that by rolling out data reports that school staff use to track progress towards our goals and to make decisions on actions that affect our students, we will increase incentives to complete the data collection and data entry work. The next case study addresses the second stage of data management by focusing on how data processing can improve a district's ability to deliver practical and useful data tools for stakeholders.

B. Creating User-Friendly Tools by Improving Data Infrastructure at Achievement First

If data collection practices form the foundation of district-level analyses and tools, data storage solutions comprise the framing. How a district chooses to store and structure its data in many ways determines the impact of a tool or analysis. If data storage decisions are made without a deep understanding of and empathy for end-user needs, the resulting structure is unlikely to deliver tools or analyses that are practically useful for schools.

1. Agency Profile

The mission of Achievement First (AF) is to deliver on the promise of equal educational opportunity for all of America's children. We believe that all children, regardless of race or economic status, can succeed if they have access to a great education. Achievement First schools provide all of our students with the academic and character skills they need to graduate from top colleges, to succeed in a competitive world, and to serve as the next generation of leaders in our communities.

Achievement First aims to create public charter schools that close the urban-suburban achievement gap, while also looking to partner with other like-minded, reform-oriented organizations and traditional school districts to maximize our collective impact. Our theory of change is that by creating the equivalent of an urban public school "district," Achievement First can serve as proof that closing the urban-suburban achievement gap is possible at district scale and inspire broader reform. Achievement First is focused on continuing to close the achievement gap and serving as an example for other public charter schools and traditional public school districts. We will continue our work until every child is given access to a great education and enjoys the real freedom that flows from that opportunity.

2. Policy/Research Question

Traditional districts have long said that the real problem is not how to create a single successful school but how to create a district of successful schools. By 2017, AF plans to operate 38 schools serving over 12,000 students (when fully grown, these schools will be able to serve nearly 16,000 students). This will make AF's total student body larger than that of 95% of districts, and AF will serve more students eligible for Free and Reduced Price Lunch (FRPL) than 97% of districts.
In order to be a proof point, a "district" of this scale must be financially, humanly, and institutionally sustainable. This includes our human and technical systems for capturing, analyzing, and sharing data. We believe that the smart use of data will dramatically increase the effectiveness of our teachers and school leaders, and that the strategic use of systems will amplify the impact of our people. We accomplish this by:

1. Ensuring AF has accurate, timely, and insightful data to make strong decisions focused on increasing student achievement; and
2. Ensuring AF has the infrastructure, systems (process + technology), and support in place to make us more effective and more efficient so that we can scale successfully.

Until recently, AF's data capture systems and infrastructure limited our ability to meet the needs of key organizational decision makers. Data reports and tools were static, standardized, siloed, Excel-based documents that were released to schools on fixed release dates. The resources required to create these data reports and tools were significant, and the creation processes were far from scalable. More painful still was the feedback from schools. Although these tools were intended to support our schools,

[…]

Rank  School                               % of Students Not Meeting Promotional Criteria
1     AF Endeavor Elementary               80%
2     AF Brooklyn High School              64%
3     AF Hartford High School              63%
4     AF Endeavor Middle School            63%
…
19    AF Hartford Middle School            19%
20    Amistad Academy Middle School        18%
21    Amistad Academy Elementary School    16%
22    Elm City College Prep Middle School  13%

Figure 1: 2012–2013 Promotional Criteria Consistency

Figure 2: 2012–2013 Withdrawal Rates for Students Recommended for Retention

PID Version 2.0: Understanding & Addressing User Needs

Given the organizational importance of the PID process, AF committed to identifying and addressing the issues that we experienced during the first year of implementation throughout the 2013–14 school year. Given our initial missteps, we knew we needed a more school-centric, iterative approach to developing these tools. The agreed-upon timeline and phases of work were:

Figure 3: Project Phases and Timeline

We designed the project to surface and clarify end-user needs far better than we had prior to building the PID Version 1.0 reports. Because so many schools had built alternate PID processes, we began by asking schools to send us examples of the reports they had created. After combing through over 20 different sets of reports, we were able to identify common needs and use them as a jumping-off point for our first round of qualitative interviews. While we hoped that these initial interviews would result in a robust set of requirements, we had also learned that our process and timeline needed to allow for feedback and iteration. We planned a second round of input to take place while we were building out our data infrastructure, allowing us to get additional input from schools and refine our plans before they were too far along.

4. Results/Impact

We will not be able to definitively assess the success of the improved PID tools and processes until the conclusion of the 2014–15 school year. The 2013–14 school year has been dedicated to better understanding stakeholder needs and developing our data infrastructure in support of those requirements.
These technical investments, coupled with our deeper understanding of school needs, have allowed us to create a new set of tools that stakeholders ranging from school-based PID owners to regional superintendents have responded to with overwhelming enthusiasm.

Lessons Learned & Data Infrastructure Impact

After speaking with over 30 school-based school leaders and PID-process owners, we found that nearly all of the feedback fell into one of two categories:

1. Reports need to be on-demand and high-stakes data must be current

While our initial requirements indicated that the PID tool would work if it was published on a trimester or quarter basis, static reports were insufficient in practice. Although all schools began each year intending to hold key family engagement meetings on the same days, over the course of the year unforeseen circumstances such as snow days or scheduling conflicts with co-located schools meant that family engagement nights shifted. As timelines shifted, so did the corresponding data entry deadlines. Furthermore, some schools simply wanted to have conversations with parents more regularly or wanted more prep time for family conversations, while others wanted to give teachers as much time as possible to enter grades before looking at final PID data for a trimester or quarter.

Complicating the issue further was our approach to collecting certain types of promotional indicator data. We relied heavily on Microsoft Excel spreadsheets as data collection tools. This simply was not a scalable approach: data management and cleaning were labor-intensive, tedious tasks. This process also meant schools could only see, in reports, the data they had submitted to the network, even if they had more current data at the school site. For example, the initial Elementary School PID reports were produced every trimester to align with the family engagement nights. However, F&P data, a key promotional indicator measuring elementary reading ability, is collected during five windows over the course of the year, while the network collected and cleaned the data at only three points during the year. This meant a school's PID report often had out-of-date information and might incorrectly categorize students. Given the stakes of the conversations with parents, schools need current, accurate information on hand. We needed to create nimble, cloud-based data capture tools if we wanted our PID reports to have relevant data and be more useful to schools.
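One way to see the currency problem concretely: each promotional indicator has its own assessment calendar, so a static report is only as fresh as the last network sync. The sketch below, with hypothetical window dates and field names, flags students whose most recent F&P record predates the currently open assessment window; this is the kind of staleness check that nimbler, cloud-based capture would make routine rather than manual.

```python
# Sketch of a data-currency check for a promotional indicator such as F&P.
# Window dates, field names, and records are hypothetical illustrations.
from datetime import date

# Hypothetical F&P assessment window start dates for one school year.
FP_WINDOW_STARTS = [
    date(2013, 9, 9), date(2013, 11, 4), date(2014, 1, 13),
    date(2014, 3, 10), date(2014, 5, 12),
]

def current_window_start(today):
    """Most recent window that has already opened as of `today`."""
    opened = [w for w in FP_WINDOW_STARTS if w <= today]
    return max(opened) if opened else None

def is_stale(latest_fp_date, today):
    """True if the student's newest F&P record predates the open window."""
    window = current_window_start(today)
    return window is not None and (latest_fp_date is None or latest_fp_date < window)

students = [
    {"name": "Student A", "latest_fp": date(2014, 3, 14)},
    {"name": "Student B", "latest_fp": date(2014, 1, 20)},  # missed the March window
    {"name": "Student C", "latest_fp": None},               # never assessed
]

today = date(2014, 4, 1)
for s in students:
    if is_stale(s["latest_fp"], today):
        print(f"{s['name']}: F&P data out of date for the current window")
```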
2. We must enable customization or explain (and gain buy-in for) standardization

Although there are many shared practices that are common across all AF schools, we do allow for a significant level of school autonomy. The initial PID process and supporting tools were not built to accommodate the significant differences across schools. For example, while we were in agreement that failing courses should be included as a promotional consideration, our schools

[…]

the right time, and in the right hands, a student's trajectory through school can radically alter depending on how they take advantage of opportunities for growth and advancement. We ensure that the data we collect and analyze is used to support students' progress through high school and into their post-secondary careers.

2. Policy/Research Question

While New Visions for Public Schools has been successful at graduating high-needs students at rates above the City average, a closer look at our last few graduating classes reveals that students are still failing to graduate despite showing academic promise. As just one example: 5,259 students in our Class of 2013 passed all of their freshman-year classes, but only 86% of those students graduated in four years. That leaves 713 non-graduates, despite their passing every class in their freshman year. These are not disengaged students who do not show up to school or come in so far behind grade level that they are unable to pass any of their classes; instead, they are students who fall through the cracks at some point in their high school careers. Looking at other metrics, such as attendance, 8th grade performance, and state tests, we see a similar pattern: students who show academic promise but are not able to follow through to graduation and college readiness. For these students, the failure is not that we have no way to reach them; it is that we do not apply ourselves systematically, intentionally, and consistently, and so we allow small problems to grow, eventually blocking students from graduating on time.

Why do we see this variation, and how can a district support schools to make timely decisions on individual students? The school calendar lends itself to strategic data conversations at different points in the year. Student scheduling periods generally occur three times a year, as do Regents[1] exam administrations; other key data are generated daily, such as attendance and student grades. But even at the scale of thousands of absences a day, capturing and organizing the relevant data, about the student as well as the school's response, must happen quickly enough that we can support our schools as they support our students. The key is a data management system that is automated, collaborative, and accessible to all users.

[1] In New York, students need to pass five Regents exams (ELA, Math, Science, Global Studies, US History) to be eligible for graduation. These exams are generally taken during the first three years of high school.

3. Project Scope and Timeline

If the missed opportunities in a school result from a lack of intentionality, what exactly are the opportunities being missed? There are many key systems within a school that impact opportunities for student growth, from budgeting to human capacity to parent engagement. Here, we will focus on one: student programming.

● Problem: Student programs are not necessarily driven by a student's past performance and current needs, meaning that a student may not be scheduled for the combination of courses most likely to either help them get back on track or push them to their top level of performance.
● Desired Outcome: Every student is programmed for their highest-priority credits, whether that means getting back on track, preparing to retake an exam, taking advanced courses, or a combination thereof.

Both the human system and the data system need to act in concert for the desired outcome to be reached. The human system identifies which staff are involved, both at the school and at the district, and what their level of involvement is. For example, school aides may be involved in preliminary attendance issues, but responsibility for action may shift to attendance teachers or school leaders as a problem escalates.
Similarly, district staff should have a window into the school's data but should intervene only when the school needs additional support. In every case, the lines of responsibility and the protocol for action triggered by the data should be clear.

The focus in this report is on the accompanying data system that moves the human system into action. While the data needed to operate the different systems may vary, the aim of the data system remains the same: provide immediate, accurate, and comprehensive data to the right people, in the right form, at the right time, allowing them to make evidence-based decisions to seize every opportunity to improve students' progress. There are several steps involved in meeting this objective, including processing the data and creating a tool accessible to users. First, however, it is essential to define the data scope of the project. Project leads should seek to answer the following questions:

● What is the opportunity for student progress?
● Who is involved, and what is their responsibility?
● What data do they need to be successful?
● When do they need it?
● In what form do they need it?
● How will they use it?
● How will we monitor the process and evaluate its success?

In this student programming case study, we defined the scope as follows:

a. What is the opportunity for student progress?

Each term, students are programmed for seven or more credit-bearing classes. The opportunity exists to recover any ground previously lost, as well as to push students to go beyond basic requirements and prepare for rigorous post-secondary work. Further, students can be programmed to match them to the teachers and circumstances that will most likely lead to success.

b. Who is involved, and what is their responsibility?

At the school level, the responsibility for programming often falls to the guidance counselors and school programmers. However, the responsibility for optimally programming students is usually not explicitly defined, because the success of a student's program relative to that student's need or potential is rarely measured. Most often, counselors are responsible for determining students' schedules, and programmers are responsible for implementing them. School leaders are generally not primarily involved with student programming; however, it is their responsibility to oversee the work of the counselors and programmers. The counselors' determination of student need should be transparent and justifiable, as should the programmers' decisions on how to schedule students. As needed, school leaders should be ready to shift resources within a school to help programmers optimize student programs. The district can play an additional oversight role and directly support counselors and programmers when there is a capacity need at a school. Determining preliminary credit needs can be automated by the district and provided to each counselor, and student programs can be evaluated by the district to identify additional possibilities for improvement. Again, the transparency of the data and of the decisions made from it allows all stakeholders to work together and ensure that all programming decisions are made intentionally.

[…]

next school year. From this year alone, however, we have shown that we can monitor progress and have an impact on the opportunities within a school for each of our students.
For example, we believe our emphasis on retaking exams to achieve college readiness standards will lead our Class of 2014 to have a significantly higher percentage of college-ready graduates, as defined by CUNY, than all previous cohorts. In particular, we saw a 3.5 percentage point increase in students who met the math standard in January of their fourth year as compared with the Class of 2013. As the math standard proves a larger barrier than ELA for our students, this will directly translate into hundreds of students going directly into credit-bearing classes during the freshman year of college who a year ago would have been stuck in remediation.

The most significant change to our approach, however, has been the focus on student programming. For the first time, in February 2014, we put into action our belief that a district can play a crucial supportive role in something as granular, time-sensitive, and school-idiosyncratic as student programming.

a. Implementation

On February 8, 2014, we collected data on all of our students, including what courses they had passed in the fall and what they were now programmed for in the spring. In the week prior, we had examined each of the 172,000 course code combinations appearing on student transcripts and assigned them to a specific graduation requirement, a necessary piece of information that has not been automated in New York City information systems. We therefore saw what students needed and what they were programmed for. By the following Monday, February 10, New Visions staff were in schools reviewing the programming gaps and supporting programmers to reschedule students. Using spreadsheet-based tools specifically designed to support student programming, we were able to go through individual students in each of our schools and examine the potential misalignments. In some cases, what looked like a gap was the result of a student being too far behind to make up their ground fully in one semester, though they were making progress toward meeting requirements; in other cases, the school's code deck contained errors, and we were able to identify and correct them. After a week of intensive work in schools, we re-pulled the data from the source systems to discover whether any changes had been made.
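The core of that comparison is a small amount of logic once course codes have been mapped to graduation requirements. The sketch below, with hypothetical course codes, requirement names, and credit values, flags a student whose spring program leaves a requirement short; a gap list of this kind is what staff then reviewed school by school.

```python
# Sketch of the programming-gap check: compare credits still needed per
# graduation requirement against credits currently programmed. Course codes,
# requirement names, and credit values are hypothetical illustrations.

# Mapping from course code to (graduation requirement, credits); in practice
# this is the large, manually curated code-to-requirement crosswalk.
COURSE_CODES = {
    "EES81": ("ELA", 1.0),
    "MES21": ("Math", 1.0),
    "SGS43": ("Science", 1.0),
    "HUS11": ("US History", 1.0),
}

def credits_by_requirement(course_codes):
    """Sum credits per graduation requirement for a list of course codes."""
    totals = {}
    for code in course_codes:
        req, credits = COURSE_CODES.get(code, ("Unmapped", 0.0))
        totals[req] = totals.get(req, 0.0) + credits
    return totals

def programming_gaps(still_needed, programmed_codes):
    """Requirements where programmed credits fall short of credits still needed."""
    programmed = credits_by_requirement(programmed_codes)
    return {req: need - programmed.get(req, 0.0)
            for req, need in still_needed.items()
            if programmed.get(req, 0.0) < need}

# One hypothetical student: needs 2 more Math credits but is programmed for 1.
needed = {"ELA": 1.0, "Math": 2.0}
spring_program = ["EES81", "MES21", "SGS43"]

print(programming_gaps(needed, spring_program))
# -> {'Math': 1.0}  (flagged for review by the counselor/programmer)
```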
b. Successes

Because this was our first time building and rolling out the new data system, we had no baseline and little idea of what to expect. We knew some of our schools actively did their own reprogramming at the beginning of the semester, while others allowed the initial programs to persist. Across our four high school cohorts, we found that 15,098 out of 37,442[2] students were programmed to come up short of their ideal credit totals by the end of the semester. This included students who were optimally programmed but too far behind to close the gap within one semester; students in schools with incomplete or inaccurate code decks; and students whose programs had not been optimized, with our focus on this third group. By the time of our second data pull, 685 students with programming gaps had been reprogrammed to be on track; another 1,281 had been reprogrammed to make up more ground than their initial programs would have allowed. In total, then, nearly 2,000 students had their programs improved over the course of that week, with the largest number coming from the Class of 2014.

This success rate is understated: because the process of reviewing students' historical course codes was potentially ambiguous, we made a conscious decision to be conservative and err on the side of false positives. If we incorrectly flagged a student, that flag could be resolved quickly by school staff; if we failed to flag somebody, however, that could lead to serious consequences for the student who missed the opportunity to get back on track. From a close examination of the data, however, we do know that at some schools there were very real gaps that existed on February 8 that were closed by February 18.

[2] Some of our schools were dropped from the analysis due to incomplete programs during the initial data pull.

c. Challenges

In our first effort to support school programming as the programs were being made, we can point to tangible changes in the schools and link those changes to the actions of our staff, empowered by our data. But we can also easily see how this process could have been improved, and we are using this reflection in preparation for next year.

Better targeting
Our first effort at reprogramming flagged numerous students as having programming gaps where none may have existed, due to anomalies in student transcripts and school sequencing. We therefore do not know the true denominator of programs that needed to be changed. Over the spring semester, we have clarified our business rules with schools and will have fewer false positives going forward.

From reactive to proactive
Schools used existing, disparate data systems to create the initial programs, after which we evaluated the programs. By moving to a more proactive strategy, we can support schools ahead of time by highlighting students' highest-need credits as soon as the previous term ends. We can even analyze interim data before the previous term ends to anticipate what the likely needs will be.

[…]

IV. References

Gordon, Robert; Thomas J. Kane; and Douglas O. Staiger. 2006. "Identifying Effective Teachers Using Performance on the Job." The Brookings Institution Discussion Paper 2006-01.

Kane, Thomas J.; Jonah E. Rockoff; and Douglas O. Staiger. 2006. "What Does Certification Tell Us About Teacher Effectiveness? Evidence from New York City." NBER Working Paper No. 12155. Cambridge, MA: National Bureau of Economic Research.

Nye, Barbara; Spyros Konstantopoulos; and Larry V. Hedges. 2004. "How Large Are Teacher Effects?" Educational Evaluation and Policy Analysis 26(3): 237-257.

Palardy, Gregory J. and Russell W. Rumberger. 2008. "Teacher Effectiveness in the First Grade: The Importance of Background Qualifications, Attitudes, and Instructional Practices for Student Learning." Educational Evaluation and Policy Analysis 30(2): 111-140.

Rivkin, Steven G.; Eric A. Hanushek; and John F. Kain. 2005. "Teachers, Schools, and Academic Achievement." Econometrica 73(2): 417-458.

Skillern, Jane Wei; James E. Austin; Herman Leonard; and Howard Stevenson. 2007. Entrepreneurship in the Social Sector. Thousand Oaks, CA: Sage Publications.

V. Appendices

Appendix A

A.1: AF's Promotion Policies

The following are AF's mandatory and additional (non-mandatory) promotional criteria.

Mandatory Promotion Criteria

The school will consider a student who fails to meet ANY of the following criteria to be at risk of non-promotion. The principal has final authority to make promotion decisions based on a scholar's readiness for the next grade.
State and Other Test Scores (K-12)
- Kindergarten – 2nd grade: Below grade level on a nationally normed reading assessment, as determined by Achievement First
- Grades 3 – 8: Score of 1 on any ELA, Math, or Writing state test
- Grades 9 – 12: In NY, score below 75% for ELA and Algebra and below 65% for all other required Regents exams

Attendance (K-12)
- 15 or more absences in a year (5 tardies and/or early dismissals count as one absence); there is no differentiation between excused and unexcused absences

Course Grades (5 – 8)
- Failing two or more of the following classes: math, reading, writing, history, and science

Course Grades (9 – 12)
- Failing two or more core/required classes (math, reading, writing, history, science, college readiness, required elective) after the summer academy session is over, OR
- Being deficient two credits from any year of high school upon entering the grade

Summer Program Completion (9 – 12)
- Successful completion of an AF-approved summer program:
  - Pre-College
  - Internship
  - Growth program
  - Summer Academy and/or SAT Boot Camp

Additional Non-Promote Criteria

Kinder to 2nd Grade
- The student scores below the 50th percentile or above on the TerraNova math exam
- The student scores below the 50th percentile or above on the DRP reading assessment
- The student scores 79 or lower on IA #5 in math
- The student scores 69 or lower on IA #5 in reading

Grades 3 – 8
- NY: Score of a "low 2" (defined by AF's scaled score chart) on both NY state tests
- CT: Score of a "low 2" (defined by AF's scaled score chart) on two of the three (math, reading, writing) state tests
- A student scored a 2 on the state test for two straight years in either math or reading (NY) or any of reading, writing, or math (CT)

Interim Assessments, Grades 3 – 8
- The student scored 59 or lower on IA #5 in MATH in the student's first year at an AF school, 64 or lower in the student's second year, or 69 or lower in the student's third year
- The student scored 59 or lower on IA #5 in READING in the student's first year at an AF school, 64 or lower in the student's second year, or 69 or lower in the student's third year

Final Grades, Grades 3 – 8
- The student's final grade in MATH class is lower than a 70
- The student's final grade in READING class is lower than a 70
- The student's final grade in WRITING class is lower than a 70
- The student's final grade in HISTORY class is lower than a 70
- The student's final grade in SCIENCE class is lower than a 70

Promotion for English Language Learners (ELLs)

NY State law and CT State law require that we evaluate the promotion of English Language Learners differently.

NY State
- Grades K-7: ELLs who have been enrolled in school in the United States for 2 years or fewer are exempt from the ELA test. Instead, they must show satisfactory progress (move up one proficiency level in reading, writing, speaking, and listening) on the NYSESLAT. For mathematics, they must score a Level 2 in English or their native language.
- Grades 8-12: ELLs who have been enrolled in school in the United States for less than 1 year have a 1-year exemption from the ELA test, but must show satisfactory progress on the NYSESLAT. For mathematics, they must score a Level 2 in English or their native language.
- Grades K-8: ELLs who have been enrolled in the United States from 2-6 years must show satisfactory progress (defined above) in English as a Second Language on the NYSESLAT OR

[…]