
Keeping Students Safe: Matching Monitoring Applications Features to Educator and Student Perspectives

Colorado Convention Center, 108/10/12

Lecture presentation
Listen and learn: Research paper

Research papers are a pairing of two 18-minute presentations followed by 18 minutes of discussion led by a discussant, with remaining time for Q&A.
This is presentation 1 of 2.


Presenters

Researcher
NC State University
@RebekahSDavis
Dr. Rebekah Davis - Dr. Davis is a Research Associate in the Program Evaluation and Education Research (PEER) Group within the Friday Institute for Educational Innovation at NC State University. She has 25 years of experience in education in North Carolina at the elementary, middle, and undergraduate levels. Her experience teaching in high-needs schools for many of those years provides a pragmatic yet urgent view of the ways evaluation and research benefit students and teachers. The work she does helps build educators’ capacity to provide research-based, engaging instruction, especially in the areas of learning design and technology.
Professor
North Carolina State University
Dr. Florence Martin is a Professor in Learning Design and Technology at North Carolina State University. Dr. Martin engages in research to create transformative learning experiences through effective design and integration of digital teaching and learning innovations. In recent years, she has researched the design of online learning environments and cybersecurity education with a focus on digital safety. She currently serves as a Senior Associate Editor for Online Learning Journal, AERA Division C Section Chair for Engineering and Computer Science, and on the advisory council for North Carolina Virtual Public Schools. More details can be found at https://florencemartin.wordpress.ncsu.edu/about-me/
Research Assistant
NCSU
Cigdem Meral serves as a Research Assistant at the William & Ida Friday Institute for Educational Innovation, North Carolina State University (NCSU). In 2023, she earned her doctoral degree from NCSU in Learning Design and Technology. Her dissertation focused on increasing diversity in STEM areas, specifically on underrepresented high school students. Additionally, she obtained a master's degree in Instructional Systems and Technology from Indiana University in 2017. Cigdem began her academic journey with a bachelor's degree in Elementary Education from Turkey. She is deeply passionate about promoting diversity in education.

Session description

This paper explores the impact of monitoring applications on their end users, teachers and students, by comparing what the applications are reported and intended to do with teachers' and students' perspectives on what actually happened during use.

Framework

The widespread use of online learning during the COVID-19 pandemic accelerated the use of monitoring applications (MAs) on devices teachers and students use for learning (Nichols & Monea, 2022). In the state of North Carolina, emergency relief funds were designated for use by schools and school systems to contract with third-party entities for technology to “facilitate mitigation of cyberbullying, monitoring of student internet activity, monitoring classroom educational devices, and assisting with suicide prevention services” (PRC 192 & 193). Through September 2022, over $5.6 million was spent on MAs chosen under PRC 192 (any MA) and PRC 193 (used for Gaggle only). The most used MA was Gaggle, reported by 142 of 274 schools and systems (51.8%), according to NCDPI records gathered as schools and systems requested funds. The only other app close to Gaggle in use was GoGuardian (80/274, or 29.2%). Others named among the 18 investigated by our team of researchers were Securly (14), LineWize (10), Bark (8), Lightspeed (8), Aristotle (4), Lanschool (4), Mosyle (1), and Impero (1).
This paper investigates the use of MAs by NC schools in response to the pandemic. Not only has there been a public debate about how MAs record, extract, and leverage the resulting information (a debate largely situated in the fields of education, media studies, and public policy), but there also remain visible, tangible impacts on teachers and students from the use of such apps. Did students and teachers perceive that the intended outcomes (mitigation of cyberbullying, monitoring of student internet activity, monitoring of classroom educational devices, and assistance with suicide prevention services) were reached?
Framework
Livingstone and Stoilova (2021) identified four major categories of online risks for children: content, contact, conduct, and contract. The categories were designed to guide policymakers and practitioners in their practical work and in communicating findings on online risk. While the fourth category is a relatively recent addition, the other three categories and the foundation of this model have been a “classic point of reference” (p. 5) in policymaking and analysis since the 2010s. One example is UNICEF’s Global Kids Online, a project that has generated cross-national comparison data on children’s safety in over 35 countries.
The research question guiding this study is: How do the self-described functionalities of MAs match the intended use and the experiences of students and teachers as end users?


Methods

The multi-method study collected and analyzed data from two sources for the above research question. The first data source was a selection of product websites for MAs, and the second was a set of online surveys (n=80) collected from students who attended public K-12 schools in North Carolina during the pandemic. (The teacher survey is still open in an attempt to gather more responses, and should the teacher interview analysis conclude in time, we will include teacher interview insights.)
Monitoring Applications
Selection of the MAs and their associated websites was guided by a database provided by the North Carolina Department of Public Instruction (NCDPI), which listed MAs used by all school districts and charter schools in North Carolina during the pandemic. As this was a self-reported and primarily open-ended survey, various technologies and platforms were listed. After the data were cleaned to eliminate redundancies and update information (i.e., platforms that no longer existed or had been acquired by other services), exclusion criteria were developed for data reduction. To qualify as an MA, a given platform or technology had to advertise the maintenance of a persistent student data record made available to the client. This boundary resulted in a list of 17 MAs.
Subsequently, the websites for these MAs were reviewed. Homepages, sub-pages, and any relevant attachments were analyzed and themed inductively, which resulted in three major code categories: risks to students, functionalities offered to address those risks, and the degree of integration that MAs had with existing technologies. The first two categories became the basis for the findings, with the risks to students later reorganized using Livingstone and Stoilova’s (2021) 4Cs framework.
Survey
In the fall of 2022, the research team gathered publicly accessible information about MAs reported by schools and school systems as purchased, or intended to be purchased, with the ARP ESSER funds under PRC 192 and 193. (There will be an appendix with the complete list.) This information was then used to help build the survey and distribution procedures, which included identifying means for recruiting participation of students over the age of 18 and teachers from schools and counties using MAs in North Carolina from March 2020 until the summer of 2023.
IRB approval was obtained to send surveys to Technology Facilitators, Technology Directors, Teachers, and Students over 18 with the assistance of NCDPI and educational institutions such as universities where students from these school systems would likely be enrolled after their public school experience. The decision to survey only students over 18 was intended to gather information from students who may be more aware of device monitoring due to their age and maturity, and to reduce the number of steps required to reach participants because they had attained the age of adulthood. The research team avoided distributing the surveys via social media because the incentive(s) offered for participation attract scam responses, which the interview team had already experienced in the process of gathering interviews. As a result, fewer responses were received than desired, but we are confident that the survey was not packed with invalid responses.
The survey began with a consent form explaining that participation was voluntary and that participants would answer a few questions to ensure they fit the target audience: teachers and students at the more than 120 schools and school systems that had applied for and/or obtained funds to use MAs under PRC 192 and 193. Three initial questions helped establish participants' qualifications to take the survey, followed by a total of 14 questions about their use of technology at school, their notification of and awareness about MA use, and their level of concern. Some questions were skipped or shown based on previous answers, and none of the answers were forced, so there are varying numbers of responses to each question. The percentages shared are based on the number of answers received for each question.
The open-ended questions were designed to determine whether there were any instances of cyberbullying resolution or suicide prevention, as that was part of the stated intent of the funding for MAs. (We will include a list of questions in the appendix.)


Results

MA Review
Risks
As these websites were the public face of the various MAs, they described various opportunities to improve educational systems, which were often communicated as risks to students. This section outlines the problems that MAs sought to address with their products and services, organized using Livingstone and Stoilova’s (2021) four online risks for children: content, contact, conduct, and contract.
Content is concerned with online media, specifically how mass-produced media can expose children to harm such as violence, pornography, hate speech, or commercial manipulation. This was the central focus for many MAs, as a majority (n=13) argued that students can access online material that is inappropriate for their age, maturity level, or status as minors. In a more practical and pedagogical context, a majority (n=10) also mentioned that unmitigated access to devices could distract students, harming their classroom productivity (e.g., browsing the Internet or playing with unrelated apps).
Contact refers to children participating in “adult-initiated online activity” (p. 5). This contact underlies two distinct but interrelated risks identified among MAs: cyber security and cyber safety. Cyber security (n=12), the secure transfer and storage of information that can be compromised by outside actors (e.g., malware, phishing, password leaks, social engineering), was a prevalent concern. MAs argued that students could divulge valuable or private information by talking to strangers online, clicking unfamiliar links, or opening unknown attachments. In turn, this could put them, their families, their devices, or others on the school network at risk. Cyber safety (n=9) is concerned with the physical aspect of that risk: the above cyber security vulnerabilities could imperil students’ well-being through predation, stalking, or abduction.
Conduct refers to the risks from exchanges between children, either as perpetrators or victims. MAs argued that cyberbullying (n=7) and school violence (n=6) were two notable conduct risks. Cyberbullying, the act of bullying in online spaces, has been a concern large enough to receive attention in CIPA (Senate, Congress, 2007); this policy’s influence could be found in the MAs that directly cited CIPA certification or argued for cyberbullying being a national issue. As cyberbullying can happen using any peer-to-peer communication (e.g., texting, email, etc.), which abound in online classroom environments, it was commonly screened for across the functionalities.
School violence, which refers to any violent act toward self or others, is an inherently physical occurrence rather than a virtual one. However, some MAs portrayed it as a material concern with online precursors, arguing that students intending harm often demonstrate changes in behavior beforehand. These behavioral changes could appear online (e.g., threatening messages, withdrawal from class, visiting websites that promote violence), placing them within the purview of MAs.
The final risk factor, contract, was added to the original three factors in 2018 as a response to the increasing datafication of students. It refers to “when children use digital services as well as when they are impacted by digital transactions conducted by others in other ways” (p. 7). No MA directly addressed this risk, possibly because their own contracts could fall under this critical analysis. However, many expressed commitment to data security and privacy within their own systems by displaying certifications and endorsements (e.g., iKeepSafe, CIPA, etc.) to assure that they could be trusted as “recognized leaders in protecting student privacy” (GoGuardian, 2023).
Livingstone and Stoilova also acknowledged “cross-cutting” (p. 8) risks that did not fit neatly into the four categories. They proposed student mental health as an example, which was also a concern among several MAs (n=8). These MAs argued that students need more personalized and thorough mental health support and monitoring than traditional surveillance and reporting infrastructures can provide. A singular but notable cross-cutting risk was presented by Gaggle: LGBTQ+ students are a particularly vulnerable group for mental health issues and for being targets of cyberbullying.
Functionalities
Although all participated in datafication, the goods and services offered by the seventeen MAs exhibited a range of tones and priorities. Gaggle seemed to put student mental health and safety at the forefront, stating at the top of its homepage that “95% of district partners believe Gaggle identified students who no one knew were depressed” (Gaggle, 2023). Several other MAs used their homepages to pitch a full suite of solutions that would “support the whole student” (Securly, 2023) or create online environments “where every student can thrive” (GoGuardian, 2023). Bark centered on “parental controls that build trust” (Bark, 2023) but offered opportunities for school systems as well. Some, like Hāpara and Sown to Grow, emphasized curricular management and social and emotional learning (SEL), while others, like Pearl Echo.Suite and Mosyle, focused on network security (e.g., security updates, remote device management, firewalls, compliance). Despite these differences, seven themes were identified across these various offerings.
Behavioral analytics was the most common functionality among MAs (n=14). Sometimes called “dashboards” (Sown to Grow, 2023) or “insights” (Lightspeed, 2023; Linewize, 2023), these analytics are a means of accessing descriptions of students’ online behavior. For many MAs, the analytics were interactive; users could manipulate and visualize data according to their needs (Hāpara, 2023; Panorama Education, 2023). Analytics could operate at the macro level, such as system-wide trends in website traffic (Lightspeed Systems, 2023), or at the micro level, in the form of individual student behavioral reports (GoGuardian, 2023; Sergeant Labs, 2023). MAs often paired their analytics with case management so that administrators could “track each student case from inception through completion” (Securly, 2023), all while continually gathering student data and gaining new insights. This was often the central functionality offered by MAs, but how such analytics were carried out and communicated relied on other functionalities.
Echoing CIPA’s goal of protecting children from inappropriate online media, a majority (n=13) of applications offered some kind of content filtering. Many employed traditional firewalls that prevented devices or networks from accessing blacklisted internet addresses; this, in addition to logging web activity, was essentially the only function of Pearl Echo.Suite, one of the MAs in this study (Pearl Software, 2023). However, some MAs used ostensibly more granular methods, such as using ML to scan a device’s screen in real time and block specific media (e.g., text, images, embedded video) while letting students access the site (Lightspeed Systems, 2023; Managed Methods, 2023). Because students can take school devices home or conduct schoolwork on their personal devices, blocking could be contingent on location or time range. Two MAs also offered time limits on content (Bark, 2023; Pearl Software, 2023).
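To make the filtering mechanism concrete, the following is a minimal, hypothetical sketch of blocklist-style filtering with an optional school-hours restriction. The domain list, function name, and policy window are illustrative assumptions for this paper and do not reflect any specific MA's implementation.

from datetime import datetime, time
from urllib.parse import urlparse

# Hypothetical blocklist; real products maintain far larger, categorized lists.
BLOCKED_DOMAINS = {"example-gambling.test", "example-adult-content.test"}

# Illustrative enforcement window for a "school hours only" policy.
SCHOOL_HOURS = (time(7, 30), time(15, 30))


def is_blocked(url: str, now: datetime, school_hours_only: bool = False) -> bool:
    """Return True if a request should be blocked under this toy policy."""
    # Some MAs limit enforcement to school hours or school locations.
    if school_hours_only and not (SCHOOL_HOURS[0] <= now.time() <= SCHOOL_HOURS[1]):
        return False
    domain = urlparse(url).netloc.lower()
    # Block the listed domain and any of its subdomains.
    return any(domain == d or domain.endswith("." + d) for d in BLOCKED_DOMAINS)


print(is_blocked("https://example-gambling.test/bets", datetime.now()))

The ML-based screen filtering described above would replace this static domain check with a model that classifies the content actually rendered on the device.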
Third-party integration was a widely advertised functionality (n=13). Many MAs assured that their proprietary technology would “fit into [a school’s] specific learning environment” (GoGuardian, 2023), citing various recognized technology brands. Third parties mostly included learning management systems like Canvas (Gaggle, 2023) and cloud-based platforms like Google Workspace and Microsoft 365 (GoGuardian, 2023). Some MAs worked at the operating system level for mass distribution across Chromebooks (Hāpara, 2023; NetSupport School, 2023) or Apple ecosystems (Mosyle, 2023). Lanschool (2023) partnered with global technology company Lenovo to provide content filtering services. In one notable instance of vertical integration, Bark (2023) sold Samsung phones with its proprietary monitoring software pre-installed.
MAs advertised acquiring data in two different ways. Recording student activity was the more common but less direct approach (n=11). Using various hardware and software, MAs accrued a range of student behavioral data, including search engine submissions, web traffic, emails, instant messages, geolocations, and webcam footage. Depending on an MA’s device permissions, data could be intercepted within a single application, suite of applications, or at the operating system level. Because text data can be used for various purposes, including web navigation and peer-to-peer communication, it seemed to be the most common data type to be passively collected in this manner. Most MAs offering behavioral analytics also recorded student activity, but four did not (Aperture, 2023; Hāpara, 2023; Panorama Education, 2023; Sown to Grow, 2023), as they used assessment data instead.
Student assessment, the more direct process of measuring student performance and feedback using surveys and assignments, was the other means of collecting data. Some MAs (n=7) used combinations of active, passive, formative, and summative assessments, often in conjunction with curriculum management platforms. These were used to build profiles on students’ emotional states, class performance, and reflections on schoolwork, among other insights. Many of the MAs that offered assessments did so from a social and emotional learning (SEL) perspective (Aperture Education, 2023; Panorama Education, 2023; Sown to Grow, 2023). These SEL assessments were check-ins with students to gauge their overall well-being and feelings about their coursework. Some MAs used recorded activity as well as student assessments to develop more holistic student profiles (Linewize, 2023; NetSupport School, 2023; Securly, 2023).
Machine learning (ML) was an essential means for a majority (n=10) of MAs to carry out their solutions. Broadly speaking, ML is the use of statistical methods and computer science to design, train, and test algorithms that can make classifications and predictions from typically large data sets (Akgun & Greenhow, 2022). It is an essential component of artificial intelligence, a term that many MAs also used to advertise themselves. For many MAs, ML drove other functionalities such as providing indexes for student performance on a dashboard (Sown to Grow, 2023), scanning and blocking unapproved screen content (Lightspeed Systems, 2023), and notifying administrators of critical response situations (Gaggle, 2023). Despite its broad applications, ML was not used by all MAs. For example, Panorama Education offered behavioral analytics via “insightful dashboards featuring unified academic, behavior, and [social-emotional learning] data for each student” (2023, para. 2) without advertising the use of ML. Instead, they merged data streams from their assessment tools and existing school infrastructures into a single interface for ease of use and reporting.
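As a rough illustration of the kind of text classification several MAs describe, the sketch below trains a tiny model to flag messages for human review. The training phrases, labels, and output behavior are invented for illustration; this is not any vendor's actual model, data, or pipeline, and it assumes the scikit-learn library is available.

# Minimal, hypothetical sketch of ML-based flagging; not a real MA's system.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labeled examples (1 = flag for review, 0 = benign); real systems train on
# far larger corpora and route any flags through human reviewers.
texts = [
    "I want to hurt myself",
    "nobody would care if I was gone",
    "see you at soccer practice",
    "what chapters are on the test",
]
labels = [1, 1, 0, 0]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

message = "sometimes I want to hurt myself"
flagged = model.predict([message])[0] == 1
# In an MA, a flag like this would feed the escalation protocol described below.
print("flag for counselor review" if flagged else "no flag")

The MAs reviewed here describe pairing automated flags of this sort with human responders through the escalation protocols discussed next, rather than acting on model output alone.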
The presence of an escalation protocol, while pitched by a minority of the MAs (n=7), was found among five of the more popular MAs and represents a salient interaction between the three datafication processes. Escalation protocols ensure that stakeholders (generally administrators and mental health counselors, in this case) receive notifications about especially high-profile student activity. Commonly cited examples included suicidal ideation and threats of violence toward specific students or school campuses. MAs used escalation protocols to identify emergencies within large, noisy data and push them up the chain of command for more localized attention. Since all MAs were in the business of communicating student behavior to their clients, the presence of an escalation protocol was difficult to define. Ultimately, a key indicator was that these MAs ensured 24/7 support, notifying “designated responders” (GoGuardian, 2023) who decide how to proceed at any given time. Notably, Gaggle (2023) showcased its ability to contact students’ homes directly via emergency response services and trained tele-therapists, ostensibly cutting out intermediaries and saving precious time in the case of a life-threatening event. However, this was only available as an optional service, and in general, MAs offering any kind of around-the-clock support did so at an additional cost.
Survey
Survey participants were provided the name(s) of the MA intended for use by their schools because we suspected users, especially students, would be unaware either of the existence of the MA or of the fact that a particular app, such as Gaggle, was considered a monitoring application and not simply an ed tech tool. The opening of the survey contained definitions for monitoring application (any software used to mitigate cyberbullying, monitor student internet activity and classroom educational devices, or assist with suicide prevention services) and incident (an instance of digital activity being flagged, or otherwise marked, with notification of school staff, parents/caregivers, and/or other entities such as law enforcement or social services).
In our survey of high school students, we found a diverse age and demographic distribution. As a reminder, no answer was forced, and the percentages reflect the number of answers received for each question. A majority, 60%, were born in 2004, while 33% were born in 2003. Regarding gender, 54% identified as female and 40% identified as male (the remaining answer options were nonbinary, prefer not to answer, and other). When examining ethnicity, 75% were white, followed by 8% Asian, 5% Black or African American, 8% Hispanic or Latino, and 3% who identified as multiracial. Furthermore, during the pandemic years of 2020-2022, 31% of the respondents were in the 11th grade, 32% were in the 12th grade, and 26% had already progressed beyond high school (they were asked to check all that applied).

Finding: There is a potential lack of messaging to users about MAs and a lack of understanding.
The preliminary survey results from teachers provided insight into stakeholders’ involvement during the introduction or implementation of the MAs. A mere 13% of respondents indicated they were offered an option to opt out of MA use; 49% stated that they were neither notified nor consulted, while 38% reported being simply informed. Additionally, teachers shared feedback regarding students’ involvement and awareness. According to the responses, only 3% of teachers believed students were presented with an opt-out option. While 43% believed students had been notified that they were being monitored, a significant 54% reported that students were neither informed nor consulted.
When students reflected on their own involvement in the same matter, a majority, 70%, acknowledged being notified about the initiative. However, only a fifth, 20%, felt they were given an actual choice to participate, and an even smaller fraction, 10%, said they were actively approached for their input. Regarding parents or caregivers, 58% of students believed these adults were simply informed, a quarter, 25%, felt they were extended a choice to participate, and 17% sensed they were asked for their perspectives.
(The final paper will include tables for reference.)
The majority of students (59%) did not have specific feedback, thoughts, or recommendations about the MAs, as indicated by their "none," "N/A," or "no" responses. This could imply either indifference or a lack of awareness about the extent of cyber monitoring by the school or district. It is worth noting that another 6% were unaware of the monitoring altogether, which raises concerns about transparency and students' right to know about the extent of their online surveillance. In our initial teacher survey, which focused on open-ended responses, approximately 6% of teachers emphasized the need for improved communication regarding the MAs. They believed that enhancing clarity about an MA's purpose and functions would safeguard students' rights and empower both students and their parents with better knowledge about the tool.

Finding: There is variation in school procedures and MA administration.
Certain teachers may have more extensive access to student data than their peers. While some schools restrict their monitoring to school hours and school-provided devices, others monitor student data continuously, regardless of the time or device used. Of the teachers surveyed, only eight indicated that they had received training on monitoring applications, while 18 confirmed they had not undergone any such training; the remaining teachers offered no specific information or feedback on the subject. Of the teachers who responded to the survey, 15 expressed interest in receiving MA training, while three did not. Notably, none of the teachers who declined the training had previously undergone it. (Another reminder: the teacher survey results are preliminary. The survey will be closed in the fall of 2023.)
Finding: The perceived level of success with the objectives of preventing harm and restricting inappropriate content was low.
Teachers were asked about their perspective on the efficacy of monitoring applications in schools, with a specific focus on two primary concerns: preventing self-harm and restricting access to inappropriate content. On the ability of these applications to prevent self-harm, the results revealed mixed sentiments. A limited 3% of teachers believed the system worked “extremely well.” While no teachers rated it as working “very well,” a minor 6% felt it functioned “moderately well.” The largest segment, combining “slightly well” at 28% and “not well at all” at 35%, suggests a skeptical attitude toward the efficacy of these applications.
For the second concern, barring students from inappropriate content, the outlook appeared more optimistic, though still varied. Notably, none of the teachers believed the tools functioned "extremely well," but 9% attested they worked "very well," and a significant 36% expressed the opinion that the applications operated "moderately well." The "slightly well" and "not well at all" categories garnered 21% and 17%, respectively, indicating some doubts still linger. Equally telling was that 17% of the educators remained unsure, marking "I don't know."
Students were also asked about their views on the performance of monitoring applications within their educational environment. For the vital task of preventing cyberbullying, the reviews were mixed. A significant 31% felt the applications did "not work well at all," while 32% believed they functioned "slightly well." A quarter, at 24%, deemed them "moderately well," and a combined 13% rated their efficacy as either "very" or "extremely well." The applications' capability to prevent self-harm received more critical feedback. A concerning 39% of students felt they didn't work well at all in this area. Close to 30% thought they worked "slightly well," while 18% found them "moderately well." A combined 14% expressed confidence in the tools, ranking them "very" to "extremely well."
Students gave their most positive ratings in the remaining category: only 13% felt the applications did "not work well at all," while 28% rated them "slightly well," 23% "moderately well," and an encouraging 38% gave a combined "very" to "extremely well" rating. When asked whether the apps had helped resolve issues, however, the majority, at a staggering 87%, either said "No" or "N/A," indicating they did not see the apps as aids in resolving issues. A minor 6.8% felt the apps inadvertently blocked educational resources, another 2% felt the applications were oblivious to signs of self-harm, and a mere 2% found them beneficial.
(The final paper will include charts or tables.)
Finding: Very few instances of helpfulness were reported.
Of the teachers surveyed, 11 reported that there were no incidents related to monitoring applications in a year. Among the preliminary findings from 51 teachers, 16 stated they were uncertain about the number of times monitoring applications played a role in addressing an incident. The remaining teachers provided varied figures, with reported incidents ranging from as few as 2 to as many as 100 in a year.
When students were surveyed about the efficacy of monitoring applications in addressing incidents related to self-harm, whether concerning themselves or a friend, their responses painted a distinct picture. Almost half, 49%, answered "No," stating that the applications did not provide assistance related to self-harm. Close behind, 46% admitted to being unaware of the applications' potential in such situations. The remaining feedback diverged into various perspectives: some found the tools helpful in certain situations, while others felt the applications merely filtered out specific words relevant to classroom tasks. A few even indicated that they sidestepped the issue altogether by not using the school computer.
When teachers were asked in the survey about the real-world impacts of the monitoring applications, a specific question stood out: were they aware of instances where these tools aided in resolving bullying situations or provided help in cases of self-harm? The majority, 67%, responded that they did not have any such examples, and an additional 6% expressed uncertainty, indicating they did not know.
However, one insight came from a teacher who shared their experience with an alert system that identified potential self-harm instances. Because of timely alerts, they were able to address these situations promptly by meeting with the students and communicating with their parents; interestingly, they noted minimal instances of bullying detected through their monitoring tool. Another example was shared in which a student's email, flagged for suicidal intent, triggered a rapid response, with the teacher receiving a notification within a mere 15 minutes.


Importance

Monitoring application (MA) use and implementation in educational settings appear to be highly varied, as revealed by our preliminary teacher survey results. One of the most striking observations is the disparity in monitoring practices across different schools and among different applications. While some schools restrict their surveillance to school hours, others opt for continuous monitoring regardless of the time. This inconsistency raises crucial questions about the primary objectives of these monitoring systems, particularly because students often use school devices outside of regular school hours and in their homes. For instance, a student using a school-issued device at home might still be under surveillance if the school's MA policy is set to "all the time" monitoring, whereas other schools might confine their monitoring strictly to school hours, ensuring student privacy when the device goes home.
Furthermore, personal devices present their own set of challenges. While some schools strictly monitor only school-issued devices, leaving personal devices unmonitored, this decision might unintentionally create loopholes for students to bypass institutional safeguards. Feedback from the teacher survey underscores this concern. One teacher highlighted, through an open-ended response, the practical implications of such an approach: students were accessing content deemed inappropriate or restricted when using their personal devices, effectively circumventing the monitoring applications set up on school devices. This suggests that while the intention behind not monitoring personal devices may be to respect student privacy, it could inadvertently grant students unhindered access to potentially harmful or distracting online content.

Additionally, such variability in monitoring practices underscores the importance of clear communication among schools, students, and parents. It is essential for all stakeholders to understand the extent, objectives, and limitations of any monitoring system in place; only with such transparency can a balance be achieved between ensuring student safety and respecting their privacy rights.

The data highlight a growing concern within the educational sector about the utilization of monitoring applications and the awareness levels of both teachers and students regarding their use in schools. Almost half of the teachers, precisely 49%, revealed they were not informed or consulted about the applications' deployment, raising potential transparency concerns. In contrast, the students' responses offer a different perspective: a significant 70% confirmed they were informed about being monitored via the school system, and 20% even mentioned they were provided an option to opt out, suggesting a level of autonomy granted to students. When it comes to the teachers' own involvement, however, the situation appears more restrictive, with a mere 13% indicating they were offered an opt-out choice. Furthermore, teachers' perception of stakeholder engagement for parents or caregivers and students was even less optimistic: according to the teachers, only 5% of parents or caregivers and a scant 3% of students were granted a choice to opt out. This juxtaposition of responses underscores the varying degrees of awareness and choice experienced by different stakeholders within the school system.
Ultimately, the challenge lies in striking the right balance. Schools need to protect and guide students in the digital realm while also ensuring they don't overstep boundaries, infringing on personal rights and freedoms. The diversity in MA implementation, as highlighted by the survey, showcases the need for a more standardized yet flexible approach accompanied by open communication channels among all stakeholders.


References

This list will be longer in the final paper with a finalized discussion.
Aperture Education. (2023). Aperture Education. Social and Emotional Learning - Aperture Education. https://apertureed.com/
Aristotle. (2023). Classroom Management Software for K-12 Educators, AristotleK12. Sergeant Laboratories. https://www.sgtlabs.com/aristotlek12/
Bark. (2023). Bark—Parental Controls for Families. Bark. https://www.bark.us/
Gaggle. (2023). Gaggle | K-12 Online Safety Management Software. Gaggle. https://www.gaggle.net
GoGuardian. (2023). GoGuardian | Engaging Digital Learning for Schools. Goguardian. https://www.goguardian.com/
Hāpara. (2023). Classroom Management Software for Schools. Hāpara. https://hapara.com/
Impero. (2023). Impero Software Solutions. https://www.imperosoftware.com/us/
Lanschool. (2023). LanSchool Teaching and Classroom Management Software For K-12 Schools. https://lanschool.com/
Lightspeed Systems. (2023). Lightspeed Systems®: Leaders in Online Safety & Education Solutions. Lightspeed Systems. https://www.lightspeedsystems.com/
Linewize. (2023). Content Filtering and Cyber Safety for Schools | Linewize. https://www.linewize.com
Livingstone, S., & Stoilova, M. (2021). The 4Cs: Classifying Online Risk to Children. CO:RE Short Report Series on Key Topics. https://doi.org/10.21241/SSOAR.71817
ManagedMethods. (2023). ManagedMethods Cybersecurity, Safety & Compliance for K-12. ManagedMethods. https://managedmethods.com/
Mosyle. (2023). Unified Apple MDM & Security—Mosyle Manager. Mosyle. https://school.mosyle.com/
NetSupport School. (2023). NetSupport School—Classroom Management Software. https://www.netsupportschool.com/
Panorama Education. (2023). Panorama Education | Supporting Student Success. https://www.panoramaed.com
Pearl Software. (2023). Web Filtering & Cybersecurity Software—Pearl Software. Pearl Software. https://www.pearlsoftware.com/products/pearlEcho/
Securly. (2023). Securly—The Student Safety Company. Securly. https://www.securly.com/
Senate, Congress. (2007). S. 49 (IS)—Protecting Children in the 21st Century Act. U.S. Government Printing Office.
Sergeant Labs. (2023). Classroom Management Software for K-12 Educators, AristotleK12. Sergeant Laboratories. https://www.sgtlabs.com/aristotlek12/
Sown to Grow. (2023). Sown To Grow. Sown to Grow. http://www.sowntogrow.com


Session specifications

Topic:
Safety, security & student data privacy
Grade level:
PK-12
Audience:
Chief technology officers/superintendents/school board members, Principals/head teachers, Technology coordinators/facilitators
Attendee devices:
Devices not needed
ISTE Standards:
For Education Leaders:
Visionary Planner
  • Evaluate progress on the strategic plan, make course corrections, measure impact and scale effective approaches for using technology to transform learning.
Systems Designer
  • Protect privacy and security by ensuring that students and staff observe effective privacy and data management policies.
  • Establish partnerships that support the strategic vision, achieve learning priorities and improve operations.