We’re reopening a discussion that was resolved years ago. When schools first started using the web, a prolonged debate followed regarding content filtering. On one side, schools have to protect children from inappropriate content on the Internet while still allowing them to take full advantage of the wealth of information available online. On the other side, whenever you’re “protecting” someone from information, some entity has to be responsible for making those decisions. In a democratic society with rights of free speech, deciding what someone else can or cannot access online is walking on very thin ice.
The debate was ultimately settled in the US by the Children’s Internet Protection Act in 2000. This law mandates filtered and monitored Internet access in schools receiving E-Rate funding. Since schools rely on E-Rate money to offset telecommunications costs, that effectively ended the debate.
In our district, Internet access is provided by our Information Technology Center, a consortium of school districts that provides technology services to its member districts. As part of the Internet access service, our ITC includes web filtering. This has made it very easy for us to ignore the problems inherent in filtering. Some sites are blocked that probably shouldn’t be. Others are not blocked and probably should be. But we don’t control the filtering. The ITC contracts with a company to provide that service, and we can’t override their decisions.
This has worked reasonably well for the last several years, but last spring, the filters were upgraded. The new filters are much more conservative than the old ones, and we’re finding that the filtering is getting in the way of our productivity. For example, if we go to Google and perform a web search, we get a list of search results. In the upper-left corner is the Google logo. Because this logo is on “images.google.com,” it gets blocked by the filter.
Other blocked sites are more disruptive. We use the Pinnacle gradebook system at the high school. Our support comes from our reseller, TNT Systems. Their web site is blocked. Two months ago, our special ed teachers went to a training session, only to find that the web resources they were trained to use are inaccessible at school. Our music department can no longer access the site they use to order sheet music. The list keeps getting longer.
To combat this, our ITC has configured a management account that allows our district to specify individual sites that can be blocked or unblocked to override the standard filter settings. So, if we want to, we can unblock these sites for our district, or block others that we don’t want our students to access.
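To make the override idea concrete, here’s a minimal sketch of how a district-level override layer sits on top of a vendor’s default filter. The domain names, the `DISTRICT_OVERRIDES` table, and the `is_blocked` function are all hypothetical illustrations; the actual ITC management account is a web interface, not an API.

```python
# Hypothetical illustration of district overrides layered on a vendor filter.
# Nothing here reflects the real ITC system; names and domains are invented.

# Sites the district has explicitly unblocked or blocked, overriding
# the vendor's default categorization for this district only.
DISTRICT_OVERRIDES = {
    "gradebook-support.example.com": "allow",  # e.g., a support site caught by the filter
    "webchat.example.com": "block",            # e.g., a site the district adds to the block list
}

def is_blocked(domain: str, vendor_blocks: set) -> bool:
    """District overrides win; otherwise fall back to the vendor's block list."""
    override = DISTRICT_OVERRIDES.get(domain)
    if override == "allow":
        return False
    if override == "block":
        return True
    return domain in vendor_blocks
```

The key design point is precedence: a district decision, in either direction, takes priority over the vendor’s default, which is exactly what pulls the responsibility for those decisions back into the district.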
This brings the responsibility for filtering back into the district. Now, we can’t just blindly say that it’s out of our hands. With the power comes the responsibility. It’s clear that we’re going to have to be a lot more specific in our policy about what we can and cannot access. Here’s what we currently have in policy:
The district will employ reasonable technology solutions, including Internet filtering software, to restrict access to objectionable material. Internet access by students will be monitored by staff or software at all times….
Network users must be aware that there are materials on the Internet that may be defamatory, abusive, obscene, profane, sexually oriented, threatening, racially or sexually offensive, or illegal. These materials include graphics, sounds, digital film clips, text, and other forms of media. Users of the Brecksville-Broadview Heights City Schools network agree not to use school district resources to download, copy, upload, transmit, forward, or display any such materials.
Network access is provided to support the curriculum and operations of our district. Any other use, including use of the network for personal, political, or commercial purposes is prohibited.
Our policy says that we won’t use the network for political or commercial purposes. Does that mean we should restrict access to these types of web sites? What about nudity? In many cases, it is sexually oriented, but not always. What about sites that help students cheat in school? Are online games, or software download sites, or file sharing resources to be restricted?
Then, there is the category of sites I classify as “shooting the messenger.” We currently block access to web-based email. Why? Because we can’t control the content of email messages. Students could receive email that contains inappropriate material (or could send it to themselves from home). This material could be accessible at school, so we block the tool. The same is true with online chat, messaging, online storage, and other sites that make the web interactive. Anonymizers and redirectors would help students defeat the filtering technology. These sites themselves are not illegal and don’t violate the policy, but they would allow students to easily circumvent the security measures in place.
I’m trying to be more formal about how we should approach web filtering. The procedure for requesting an override will be specific and detailed. The decision to override the default filter settings will rest with more than one person, and those people will represent the various age levels of our students. I’ll certainly be soliciting the input of administrators, tech team members, media specialists, teachers, and you. But here’s where I’m starting:
General Filtering Philosophy:
- Internet filtering is designed to prevent students from mistakenly or accidentally accessing Web content that is inappropriate for school.
- Because of the changing nature of the Internet, no filtering solution can be 100% effective in blocking access to inappropriate content.
- Student supervision and student responsibility are key components of responsible Internet use.
- Decisions on blocking or not blocking individual web sites should be consistent with the web filter’s default settings unless there is overwhelming evidence that the default setting is inappropriate or inapplicable for our schools.
- When there is doubt about whether a particular site should be blocked or not blocked, preference should be given to the more conservative practice of blocking the site.
Criteria for Evaluating Individual Sites:
- Does the site contain objectionable content, as defined by policy and guidelines?
- Is the site likely to contain objectionable content in the foreseeable future?
- Does the site make it possible to access objectionable content that would otherwise be inaccessible?
- Is the educational value of the content greater than the risk of inappropriate content being accessible?
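The questions above amount to a decision procedure, which can be sketched as code. This is an illustration of the checklist as I’ve stated it, not district policy; the function name and parameters are my own, and the ordering simply encodes the principle that doubt resolves toward blocking.

```python
# Hypothetical sketch of the override-review checklist as a decision
# function. The four parameters correspond to the four questions above.

def review_override(objectionable_now: bool,
                    objectionable_soon: bool,
                    enables_objectionable: bool,
                    value_outweighs_risk: bool) -> str:
    """Return 'block' or 'unblock' for a site under review.

    Any objectionable-content answer wins immediately; otherwise the
    site is unblocked only if its educational value clearly outweighs
    the risk. Everything else resolves to the conservative default.
    """
    if objectionable_now or objectionable_soon or enables_objectionable:
        return "block"
    if value_outweighs_risk:
        return "unblock"
    return "block"  # when in doubt, block
```

Writing it out this way makes one property of the checklist visible: a site with genuine educational value is still blocked if it also opens a path to objectionable content, which is the “shooting the messenger” tension described earlier.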
This isn’t a challenge I’m looking forward to, but I think having the guidelines nailed down in advance will make it a manageable process.