In the research and development proposals for common standards developed by the US Department of Education’s Institute of Education Sciences (IES) and the National Science Foundation (NSF), two phrases are used to characterize research results: learning outcomes and educational outcomes. While these ways of assessing educational productivity are necessary, they are not, on their own, sufficient to cover the range of federal investments in educational systems. A proposed modification to these evidence standards adds the phrase systemic effects (or similar wording) for areas where outcome or productivity measures are not optimal or appropriate. This modification would help with innovations—often technologies—that have theoretical connections to productivity but whose evidentiary linkages can be complex and/or take many years to become manifest, as well as with policy areas that direct systemic behavior independent of productivity.
Systemic effects become important with technologies that operate at systemic scales. Because these tools can affect many parts of an organization or system at the same time, productivity measures that typically focus on transitional data points become weak indicators. A well-known example is distance education, where research has shown dramatic growth in the industry but modest improvements in learning (Means et al., 2010). The focus on productivity has obscured how these same innovations can impact teaching, organizations, and markets. This kind of information could be especially helpful as there is interest in applying these technologies in K-12 settings. Other technologies, such as those from the federally funded State Longitudinal Data Systems (SLDS) program, the Learning Registry, and the National Instructional Materials Accessibility Standard (NIMAS), are all intended to operate across broad systemic spans (Piety, 2013). Many of these are motivated by expectations of systemic realignments in which increased productivity is anticipated, but the main thrust is systemic change. Even new learning standards efforts—notably the Common Core and the Next Generation Science Standards—can be seen as systemic technologies designed to change and realign activities, and in the process they may alter how productivity is measured and compared.
A focus on systemic effects would allow researchers to investigate the pathways of technology adoption more broadly, without narrowly constraining their studies to production. As researchers in organizational and technology studies have found, technologies can impact teams, markets, and systems (Ciborra, 1993) in different ways and over different timescales. There are adoption and adaptation processes that have at times been called diffusion (Rogers, 2010). Research focusing on individual or group productivity will likely miss important evolutionary processes and the lessons that can be learned from them.
In a number of areas, systemic effects can be directly tied to policy goals that are neutral in terms of productivity. For example, the Individuals with Disabilities Education Act (IDEA) specifies that students with disabilities should be taught in the ‘least restrictive environment’ (Public Law 108-446). There are also various regulations and laws that specify students should have access to instructional materials and educational opportunities despite physical impairments. While access to these educational tools may result in greater learning and educational outcomes, comparable outcomes may also be achieved with lower costs and/or less effort borne by other stakeholders. A focus on systemic effects would allow researchers to investigate these technologies and the complex ways they are used across various practice areas aligned to policy goals.
Developing a robust evidence base of systemic effects introduces opportunities and challenges. The opportunities include being able to “see” educational systems in action and draw together different kinds of evidentiary artifacts that can represent many ecosocial levels of activity, from learning through administration (Bransford et al., 2005; Lemke & Sabelli, 2008), at the same time. This systemic vision will likely be like other depictions of education: imprecise and influenced by context. However, by signaling its importance through language that categorizes educational research, IES and NSF will help to promote needed discussion in the field about what these effects can be and how they can be reliably measured.
There will also be challenges to moving beyond outcome measures. Measuring productivity and effectiveness can be done according to methods that have been endorsed in policy and in national consensus publications, including Scientific Research in Education (NRC, 2002), and that are supported by large professional communities that have agreed on basic tenets of what counts as evidence and rigorous analysis. Moving away from these paradigms will entail engaging the broad and loosely connected qualitative research paradigms, where evidence standards become more diverse and rigor less clear. Design-based research (DBR), which conceptually aligns with new technologies even though it has largely been used in the study of classroom models (Cobb, Confrey, Lehrer, & Schauble, 2003; Dede, 2004; Sandoval & Bell, 2004), will likely become part of the conversation.
Systemic effects are also an area where there are opportunities to learn from and adapt methods that have been developed in other fields to study organizational change processes. Ultimately, systemic effects will lead to important questions of rigor and of valuing a range of evidence types that vary not only in form but in their ability to support high-quality inferences, as new work by Behrens, Mislevy, Piety, and DiCerbo (2013) explores. Focusing on systemic effects presents the field with an opportunity not only to enlarge its own methodological tools but to benefit from connections to other domains.
Behrens, J., Mislevy, R., Piety, P. and K. DiCerbo (2013). Inferential Foundations for Learning Analytics in the Digital Ocean. Draft Report to the Learning Analytics Work Group. H-Star Institute, Stanford University.
Bransford, J., Barron, B., Pea, R. D., Meltzoff, A., Kuhl, P., Bell, P., … & Sabelli, N. (2005). Foundations and opportunities for an interdisciplinary science of learning. The Cambridge handbook of the learning sciences, 39-77
Brynjolfsson, E., & Hitt, L. (2000). Beyond Computation: Information Technology, Organizational Transformation and Business Performance. The Journal of Economic Perspectives, 14(4), 23-48.
Ciborra, C. (1993). Teams, Markets, and Systems: Business Innovation and Information Technology. Cambridge: Cambridge University Press.
Cobb, P., Confrey, J., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32(1), 9-13.
Dede, C. (2004). If Design-Based Research is the Answer, What is the Question? A Commentary in the JLS Special Issue on Design-Based Research. Journal of the Learning Sciences, 13(1), 105-114.
Lemke, J. L., & Sabelli, N. H. (2008). Complex systems and educational change: Towards a new research agenda. Educational Philosophy and Theory, 40(1), 118-129.
Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2010). Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies. Washington, DC: U.S. Department of Education, Office of Planning, Evaluation, and Policy Development.
National Research Council (NRC). (2002). Scientific Research in Education. Washington, DC: National Academy Press.
Piety, P. (2013). Assessing the Educational Data Movement. New York, NY: Teachers College Press.
Public Law 108-446 (2004) The Individuals with Disabilities Education Improvement Act.
Rogers, E. M. (2010). Diffusion of Innovations. New York, NY: Free Press.
Sandoval, W. A., & Bell, P. (2004). Design-based research methods for studying learning in context: Introduction. Educational Psychologist, 39(4), 199-201.
Jeanne Marie Iorio and Susan Matoba Adler’s (2013) commentary “Take a Number, Stand in Line, Better Yet, Be a Number Get Tracked: The Assault of Longitudinal Data Systems on Teaching and Learning” takes aim at a number of issues, including the growth of educational data in state databases and the unique numbers used to identify and collect records on students and teachers, saying:
Statewide longitudinal databases are becoming sources for decision-making by policymakers, administrators, and teachers. These databases are tracking children and teachers, reducing the performance of children and the work of teachers to numbers. We call for an end to the obsession with the quantitative and hope for a rethinking of assessment and teaching practices that trust children and teachers as capable and critical to learning, teaching, and assessment.
The authors also question an organization called the Data Quality Campaign (DQC), linking the DQC to a litany of concerns about how collecting and using data is changing education. I agree with Iorio and Adler that this is an important topic. In a new book titled Assessing the Educational Data Movement, published by Teachers College Press (Piety, 2013), I explore this time of change in American education and look specifically at relationships between different organizations involved in promoting the use of educational data. I studied the DQC in some depth and highlighted its mission, which, in its own words, “supports state policymakers and other key leaders to promote the development and effective use of statewide longitudinal data systems.” I featured the DQC for two reasons. First, it is historically significant that an organization focused on the quality of educational data would emerge during this time. The DQC began around 2005, which makes it a relative newcomer in a field where some communities have existed for over a century. Second, the kind of information that the DQC provides is unique and important for understanding how data is and can be used. Much of educational research has historically been regional. Much of it, rightly so, has focused on districts, schools, teaching, and students. This has left a gap in understanding what is occurring in and across the different states, and the DQC’s work and publications provide important insights into a growing national educational information infrastructure. Examining organizations that play a role in the formation of national policy should be fair game. Being fair, as scholars, about their work and intentions is equally so.
With over 100 partners—including the Council of Chief State School Officers, the American Association of School Administrators, the Consortium for School Networking, and the American Federation of Teachers—the DQC is now a fixture in the conversation about data in American education. It has received support from large philanthropies, including the Bill and Melinda Gates Foundation, as well as strong support from the current and past administrations and members of Congress. It is a non-commercial entity with little connection to vendors. The DQC frequently hosts forums that bring together different points of view around data questions. I have attended over a dozen such events on topics that include data privacy and governance, using data to understand college- and career-ready educational processes, and teacher effectiveness. I have never heard discussions about reducing any part of education to numbers. Rather, there has been a deepening appreciation for the enormous challenges associated with making education a more information-driven field. A common phrase the DQC uses in its advocacy has been “data should be used as a flashlight, not a hammer.” Recently the director of the DQC, Aimee Guidera (2012), wrote a commentary for Education Week titled “What’s in a Name? More Than a Single Data Point” that argued against releasing teachers’ names with value-added scores, as had been done in New York and Los Angeles.
To be clear, I believe that asking questions about educational data is critical. The decisions made today about infrastructures impact our children’s future and deserve a healthy debate. While there may not be technical limitations to including broad kinds of evidence in these systems, as Iorio and Adler suggest, those areas that are developed first will be well positioned later. The use of value-added models for math and literacy is directly related to those subjects being tested in response to NCLB. Whether one supports these measures or not, the linkage between the collection of certain kinds of data and its subsequent uses seems clear. National conversations about what kinds of data to include should include many voices, the DQC’s among them. Before scholars or advocates attempt to place their views on a moral high ground, they might also consider the validity of other perspectives. Some believe the collection of certain kinds of data distorts education and increases inequity. Others believe the lack of data obscures educational practice where inequity is embedded. Education is a vast, complex endeavor, and both of these interpretations are supported by at least some evidence. How we frame these differences and treat other perspectives has implications for the kinds of relationships we have later on. If we want to make real progress in our field, it is important to engage in a thoughtful and inclusive debate. I respect, but do not always agree with, Iorio and Adler’s point of view when they say:
Reduced to unique identifiers, children and teachers are positioned as people with no voice and no valid contribution to how education might be imposed upon each of them…[T]he teacher is explained through her teacher education program, her professional development experiences, and the performance of her students on a quantitative test… How can we reduce the complexity of the child and teacher to numbers? Why are we ignoring the richness of relationships between teacher and student to quantify performance of children and credentials of teachers? How can policy and administrative decisions be based on this small and limited quantitative facet of the people a school system serves? Have the ease of numbers and belief that quantitative work is not biased become an easy way to forgo thoughtful decision-making?
Strong feelings about educational data are understandable because the data can be so consequential for students and teachers. With each advance of technology comes concern about the changes it brings. It is hard to find a technology—from the printing press to steam engines to television—that has not brought apprehensions about what is slipping away. Students of Plato may recall his dialogue “Phaedrus,” written around 370 BC, in which King Thamus debates with the Egyptian god Theuth, who had given man the new technology of writing. Plato (Hackforth, 1972) posited that the youth would lose their powers of memory, saying:
[F]or this discovery of yours will create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.
While a different time and technology, the tone is similar. While many appreciate the aesthetic of old cars, period movies, and the pastoral charm of simpler days, few people reading this would consider living without refrigeration, personal computers, or modern vehicles with Bluetooth capabilities. Progress and change happen. My book argues that the use of data is just one way in which technology is moving education forward, but that the ways it is improving education are not always clear. I argue that much is missing—specifically an understanding of educational practice in classrooms and schools—that can inform the currently oversimplified views of educational data.
In the modern era, the development of digital records has given rise to fears about people being reduced to numbers and about the overreach of “the state.” Each year, most adults receive various forms of communication that include a social security number (SSN). When the Social Security Act was new, many people were similarly alarmed by the idea of ‘reducing people to numbers.’ Now the SSN is prosaic, and these numbers are usually assigned shortly after birth. If today the US Government entertained doing away with these common identifiers, many people would be alarmed and wonder instead how matters of public safety and finance could be managed without good record keeping.
In the 1960s, before optical character reading and electronic commerce, many checks and bills began to be printed on forms that could be read by the computers of that era. Many came with a statement that read “DO NOT FOLD, SPINDLE, OR MUTILATE.” (Spindling was a process of piercing paper documents so they could be strung together for filing.) The machines used to read those documents would jam when the paper wasn’t just right. In those early days some protesters would deliberately fold, spindle, mutilate (and even staple) those documents to upset the machinery of the modern state. Successive waves of newer technology rendered those concerns moot. While that phrase about folding, spindling, and mutilating is now part of our cultural history (Lubar, 1992), we can find at least two messages in it today that relate to what Iorio and Adler wrote. First, some of the immediate fears about new technology fade quickly with time and technological advances. The concerns about student and teacher identifiers may then also fade, although some of the issues implicated in data collection, such as school accountability and evaluating teachers, may not. The second has to do with how we represent others in our work. In advocating for our own point of view, do we unwittingly fold, spindle, and mutilate the message and well-intentioned work of others? I confronted this question in my book when discussing a number of organizations and projects that have attracted polarizing views. Using principles of qualitative inquiry, I consulted with the organizations I wrote about, including the DQC, to give them an opportunity to help represent their work in what I wrote. Some may not agree with every part of the book, but they will hopefully believe the approach is fair, encourages thoughtful discussion, and for these reasons has value.
The world of educational data presents challenges for study. As Coburn and Turner (2011) said, “In many ways, the practice of data use is out ahead of research. Policy and interventions to promote data use far outstrip research studying the process, context, and consequences of these efforts.” The US Department of Education (2010), after a large study of this area, stated, “The data systems themselves, the assessment data that populate district data systems, and school and district practices around data use are all changing rapidly.” Educational researchers have some catching up to do. Researchers from other fields may find educational data a challenging area as well. Education is unlike other fields not only because it is a social practice but because it is social on many levels. These realities show up in the data: in its quality and consistency, and in how difficult it is to use for comparisons. There are indeed many good and important questions about what kinds of artifacts are best to collect and how to integrate them into practices. As my book details, teachers and other practitioners have been far less represented in the design of educational data infrastructures than communities representing institutional interests. There is an opportunity here for Iorio and Adler and those who agree with them.
One thing that Plato missed almost 2,400 years ago, and that others are missing today, is how new technologies work beyond the level of individuals. Historically, sociotechnical revolutions from the printing press to television to the Internet have changed professions and organizational structures. Time and again, these kinds of revolutions occur on many levels simultaneously, including individuals, teams, markets, and systems (Brynjolfsson, 1993; Ciborra, 1993). The history of writing (Harris, 1986) shows it had great impact on administrative structures and commerce. Not surprisingly, much of the emphasis around data in education has been on how individual practices can be assessed and changed, including those of students, teachers, and schools. The organizational and professional changes are less considered. Blended learning school designs—new ways of structuring schools—are also emerging during this data movement. The emergence of the DQC is then not so surprising after all.
Organizations like the DQC, while they may not have all the answers, exist today as important parts of the national conversation, and their work has represented the goals and interests of their stakeholders. Unfortunately, at the time of this commentary, the list of DQC partners does not include any colleges of education. As Mandinach and Gummer (2013) recently discussed, even though data matters for teachers in their work, there has been little attention in colleges of education to developing even basic data literacy skills. There are fresh signs of progress, however, as a number of leading universities are developing programs in learning analytics and data sciences, and new associations, including the International Educational Data Mining Society and the Society for Learning Analytics Research, have emerged with conferences and journals.
While it is popular to refer to other countries (e.g., Finland, Singapore) as educational models, in terms of data our situation is very American. Constitutionally, education is largely a state matter, and the history of local control gives us a highly decentralized management of information compared with other countries. At the upcoming American Educational Research Association (AERA) meeting in San Francisco, I have organized a symposium titled “Big Data American Style.” Our discussant is the incoming president of AERA, and the paper authors include faculty from Stanford University and Teachers College, a vice president of Pearson’s Center for Digital Data, Analytics & Adaptive Learning, the first Chief Privacy Officer at the United States Department of Education, and the DQC. The DQC’s paper is titled “The 4 Ts of State Data Systems: Turf, Trust, Technology, and Time: Policy Perspective on Empowering Education Stakeholders with Data.” For many in educational research, the DQC will bring a new and important perspective. Will it be completely representative of all of the interests in educational data? Probably not. None will. We come together in the American spirit of encouraging diverse points of view to help collectively inform our understanding. Practitioner-oriented researchers are welcome to come, ask thoughtful questions, and help campaign for high-quality discussions about educational data.
Brynjolfsson, E. (1993). The productivity paradox of information technology. Communications of the ACM, 36(12) 66-77 .
Ciborra, C. (1993). Teams, markets and systems: Business innovation and information technology. Cambridge: Cambridge University Press.
Coburn, C. E., & Turner, E. O. (2011). Research on data use: A framework and analysis. Measurement: Interdisciplinary Research & Perspective, 9(4), 173-206.
Guidera, A. (2012). “What’s in a Name? More Than a Single Data Point.” Commentary in Education Week, August 6, 2012. http://www.edweek.org/ew/articles/2012/08/08/37guidera.h31.html.
Hackforth, R. (Ed.). (1972). Plato: Phaedrus. Cambridge University Press.
Harris, R. (1986). The Origin of Writing. London: Duckworth.
Iorio, J. M., & Adler, S. M. (2013). Take a Number, Stand in Line, Better Yet, Be a Number Get Tracked: The Assault of Longitudinal Data Systems on Teaching and Learning. Teachers College Record, March 8, 2013.
Lubar, S. (1992). Do Not Fold, Spindle or Mutilate: A Cultural History of the Punch Card. Journal of American Culture, 15(4), 43-55.
Mandinach, E. B., & Gummer, E. S. (2013). A Systemic View of Implementing Data Literacy in Educator Preparation. Educational Researcher, 42(1), 30-37.
Piety, P. (2013). Assessing the Educational Data Movement. Teachers College Press. New York, NY.
Postman, N. (1993). Technopoly: The surrender of culture to technology. Vintage.
U.S. Department of Education, Office of Planning, Evaluation, and Policy Development. (2010). Use of Education Data at the Local Level: From Accountability to Instructional Improvement. Washington, DC.
“Phil Piety’s book employs insights from the learning sciences to illuminate policies and practices for using information to improve American education. His analyses reveal deeply-held convictions by educators concerning uses of data and why some of the test-based policies of the educational data movement—including No Child Left Behind and value-added models for teacher evaluation—have turned out more challenging in practice than in theory. Piety highlights our need to understand the multi-layered social nature of education, recognize a number of fundamental characteristics of educational data, and to integrate design-based principles for enhancing the socio-technical activity we call schooling.”
—Roy Pea, David Jacks Professor of Education and Learning Sciences, Stanford University
“Unquestionably there has been a dramatic change in the collection and use of education data within a relatively short period of time. This critically important book highlights what constitutes the education data movement describing the vernacular of what every education researcher, practitioner, and policymaker needs to be aware of. Cautiously optimistic about the future, Piety points out the challenges educators will face as they struggle with their ever increasingly complex datasets and how they can be made useful for measuring learning, teacher quality, and organizational change.”
—Barbara Schneider is the John A. Hannah Chair and University Distinguished Professor in the College of Education at Michigan State University, and president of the American Educational Research Association, 2013-14.
“Everyone who wants to gain a better understanding of how data is transforming education should read this book. Piety’s analysis is comprehensive and covers every dimension of the American education system. He impressively connects the dots among the numerous institutions and actors that comprise the data movement. This book is a triumph.”
—Darrell West, Vice President and Director of the Center for Technology Innovation, Brookings Institution
“Piety brings a fresh perspective to ‘the educational data movement,’ situating its emergence historically, linking it to developments in various institutional fields, and framing it as a ‘sociotechnical revolution.’ Essential reading! Both proponents and opponents of the ‘data movement’ will learn from this book.”
—James P. Spillane, Spencer T. & Ann W. Olin Professor in Learning & Organizational Change, Northwestern University
For better or worse, many educational decisions that were once handled on a personal level by teachers or administrators now increasingly rely upon data and information. To be successful in this era, educators need to understand this broad sociotechnical revolution and how it is realigning traditional roles and responsibilities. In this book, the author draws on his unique background in learning sciences, education policy, and information systems to provide valuable insights for both policy and practice. The text discusses many current topics including technology-rich methods of teacher evaluation, big data and analytics, longitudinal data systems, open educational resources, blended and personalized learning models, and new designs for teaching.
This comprehensive book:
• Examines the social and historical context of the educational data movement as it unfolds across educational levels.
• Synthesizes different research traditions from inside and outside of education.
• Assesses the successes, challenges, and potential of data analytics.
• Helps educators and innovators design technology-rich solutions for greater student success.
• Discusses the catalytic role that foundations have played in making education a more informational and evidence-based practice.
Philip J. Piety is a national expert in educational data, founder of Ed Info Connections, a benefit corporation working to improve the information educators use, and a faculty affiliate of Johns Hopkins University.
Excerpt from the book Assessing the Educational Data Movement, to be published in April 2013
One of the issues to emerge in policy discourse and from funders as the educational data movement was taking hold involved personalized learning. While education has seen pendulum swings between standardization and personalization for decades, what is new is the idea of using data and information to drive individual attention to student needs. While other fields routinely use datasets about specific customers, and others like them, to present more relevant options and services, the classical model is still largely focused on providing the same options to all students irrespective of what information about those students might suggest. What has been done in other fields is to use data about individuals to help segment a large market into smaller groups to which services can be targeted. Of course, students within a classroom and teachers within a school are already part of a small group. It is possible that there are others like those students or teachers in different locations, and that data can provide opportunities to see what types of approaches and tools work well with students and teachers who have similar characteristics. (more…)
Another excerpt from my forthcoming book, The Educational Data Movement: Crossing Boundaries, Searching for Student Success
The classical model of teaching centers on the role of the teacher as manager of the classroom, conveyer and evaluator of knowledge. In this model, teachers direct everything that happens for all the students inside a classroom. Once the door is closed, teachers usually decide what order the information will be taught, which students will sit together or work together, and how to gauge and measure student understanding. The classical model of teaching is part of a traditional school design where all of the staff are arranged in a way that supports teachers in this classical role. Schools that have specialists—usually reading, math, and special education—use those specialists to augment the traditional classroom teachers. (more…)
This is an extract from one of the chapters of my upcoming book titled: The Educational Data Movement: Crossing Boundaries, Searching for Student Success.
One of the greatest challenges of our time—of research, of school leadership, and now of measurement—is to define teaching, or what teaching should be. The educational data movement occurs at a time when there is great uncertainty about what it means to teach. The types of changes education is going through, as other fields have before it, impact jobs, roles, and organizational structures. During this time, there has been a quest to describe the job of teachers in ways that can be used to compare teaching with other professions. The definition of teaching is an important factor in how we consider teachers using data in their jobs. Is their work the mechanical transmission of knowledge, or are they knowledge workers and knowledge creators? These are the kinds of questions that highlight what makes education different from other fields. (more…)
Big data is a new term that can imply both new forms of data and new analytic techniques. While big data is routine for many businesses and some sciences, it is new to education. Two characteristics of big data (in addition to lots of data) are that different kinds of information—some more structured than others—are used together, and that the focus is on developing deep understandings of systems and context rather than only on outcomes. Organizations that leverage big data are often better able to understand those they serve and their environment, to isolate and focus on their external and internal challenges, and to monitor their efforts. In American education, there is hope that big data tools can be a lever for change. (more…)
In recent months there has been increasing attention to the need for education technologists who can focus on the many large and not-so-large datasets that are proliferating in education. Some have called for exploration into the educational data sciences and asked how to prepare this new type of professional. Many questions are emerging about this important area. What classes should they take? Can this be a college major, or should it be a professional certification? How is this different from educational statistics? (more…)
Why should regular educational researchers, those working with science or social studies or emerging literacy, pay attention to the changes in education around data? I can think of three reasons. (more…)
On May 17, 2012, political science professor Patrick McGuinn posted a paper titled “Fight Club: Are Advocacy Organizations Changing the Politics of Education?” in which he discussed a network of education reform advocacy organizations (ERAOs) that regularly meet to plan strategy for advancing their agendas, which include charter schools, alternative teacher preparation, and an emphasis on test scores. It is a fascinating account of a network of organizations that many people know little or nothing about and yet have been busy over the last several years with ambitious and in some ways aggressive advocacy. (more…)