Thursday, July 26, 2012

Communications and the flood of information

We live with too much information. Perhaps 'work with' rather than 'live with' - outside of work, the individual can choose what their information landscape looks like. Here's something of a taxonomy (in our school; likely somewhat similar in many schools):

  • email (access anywhere, anytime)
  • online calendars
  • portals (e.g. Compass)
  • intranet/websites
  • social media
  • files on a computer
  • cloud based files/applications
  • smartphone apps
  • tablet apps
  • paper-based chronicles
  • books
Compare to 5 years ago:
  • email (access at laptop on desk, probably wired connection)
  • paper-based chronicles
  • books
  • files on a computer
  • intranet/websites
  • social media
Compare to 10 years ago:
  • email (access via PC)
  • paper-based chronicles
  • books
  • files on a computer
As the process has been quite gradual, there hasn't really been any discussion of the implications, nor training on how to manage the flood. This is certainly a workload issue, but for many the situation exists outside of work too, or the demarcation of work blurs (an issue in itself). And it's not only staff who need the discussion and training - this applies just as much to students.

A major concern here is the signal-to-noise ratio. The noise is getting both more frequent (more communications occurring) and stronger (every communication fighting to stand out). So we end up missing things, and the cycle feeds back into itself. Finding the signal in the noise takes time and energy. There are two possible ways forward: decrease the noise, or filter the noise. The former comes from training, etiquette, and careful selection of systems (perhaps cutting some systems out). The latter comes from training people to use systems more effectively, and employing smarter, better suited systems.
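
To illustrate the filtering idea only - a toy in Python that scores incoming messages so likely signal sorts above likely noise (the keywords, weights, and addresses are entirely made up):

    # Toy priority filter: score a message by sender and subject keywords,
    # so likely signal sorts above likely noise. Weights entirely made up.
    WEIGHTS = {"urgent": 3, "reports": 2, "newsletter": -2, "fwd:": -1}

    def score(sender, subject):
        s = 2 if sender.endswith("@education.vic.gov.au") else 0
        s += sum(w for word, w in WEIGHTS.items() if word in subject.lower())
        return s

    inbox = [("vendor@example.com", "Fwd: Newsletter July"),
             ("principal@education.vic.gov.au", "Urgent: reports due Friday")]
    for sender, subject in sorted(inbox, key=lambda m: -score(*m)):
        print(score(sender, subject), subject)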

A second (perhaps more important) concern is determining whether the flood of information is worth it. And if it's not, whether it can be stopped (you can't stop progress).

This issue will be the focus of discussion for much of the coming year.

Continuous and Connected

On paper, data has to follow slow cycles. Computers make such cycles unnecessary, but our workflows seem stuck in a paper-based mindset. Here's what's possible now:

Continuous Collection

There is simply no need to wait for a reporting cycle at the end of each semester when that data can be gathered continuously throughout the semester.
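
A minimal sketch of what that could look like at the data layer, assuming nothing more than a single SQLite table (all names here are illustrative, not from any system we actually run):

    import sqlite3

    conn = sqlite3.connect("assessment.db")

    # One row per assessment event, recorded as it happens rather than
    # batched up at the end of semester.
    conn.execute("""
        CREATE TABLE IF NOT EXISTS result (
            student_id   TEXT NOT NULL,
            subject      TEXT NOT NULL,
            task         TEXT NOT NULL,
            score        REAL,           -- raw mark for the task
            recorded_at  TEXT NOT NULL   -- date the mark was entered
        )""")

    # The teacher enters the mark the day it's given, not months later.
    conn.execute("INSERT INTO result VALUES (?, ?, ?, ?, ?)",
                 ("S1234", "Physics", "Prac report 3", 17.5, "2012-07-26"))
    conn.commit()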

Continuous Analysis

Then the data can be continuously looked at, trends noted, and students and teachers who are falling behind given a hand. The concept of reporting could almost be dropped. Parents could also be given the data on a continuous basis, but that opens up a separate can of worms.
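
A toy sketch of the kind of check this enables - flagging a student whose recent scores have dropped relative to earlier ones (the window and threshold are arbitrary):

    # Toy trend check: compare the mean of the most recent scores with
    # the mean of the scores just before them.
    def falling_behind(scores, window=3, threshold=0.1):
        if len(scores) < 2 * window:
            return False  # not enough data yet
        recent = sum(scores[-window:]) / window
        earlier = sum(scores[-2 * window:-window]) / window
        return earlier > 0 and (earlier - recent) / earlier > threshold

    # Percentage scores for one student's tasks, in chronological order.
    print(falling_behind([78, 81, 75, 72, 64, 61]))  # True - a >10% drop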

Multiple Sources

Data is connected. Some systems may try to become the One True Data Source(TM), but in reality they fail, and become just another data source in the equation. Data analysis has to account for this. I don't know how well analysis services can query disparate DB systems - for the moment I'm figuring some synchronisation will be the easiest way to achieve this (i.e. creating One True Data Source(TM) from a number of external ones).
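
A sketch of that synchronisation approach, assuming each external system offers some sort of export (the fetch_* functions are placeholders, not real APIs):

    import sqlite3

    # Each fetcher wraps one external system's export mechanism (CSV dump,
    # web service, direct DB query - whatever that system offers). These
    # are stand-ins, not real interfaces.
    def fetch_attendance():
        return [("S1234", "2012-07-26", "present")]

    def fetch_results():
        return [("S1234", "Physics", "Prac report 3", 17.5)]

    warehouse = sqlite3.connect("warehouse.db")
    warehouse.execute(
        "CREATE TABLE IF NOT EXISTS attendance (student_id, date, status)")
    warehouse.execute(
        "CREATE TABLE IF NOT EXISTS result (student_id, subject, task, score)")

    # Naive full refresh: wipe and reload each table. Incremental sync
    # would be the next step once volumes grow.
    warehouse.execute("DELETE FROM attendance")
    warehouse.executemany("INSERT INTO attendance VALUES (?, ?, ?)",
                          fetch_attendance())
    warehouse.execute("DELETE FROM result")
    warehouse.executemany("INSERT INTO result VALUES (?, ?, ?, ?)",
                          fetch_results())
    warehouse.commit()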

The (newly developed) school reporting package we've just moved to doesn't account for this. It can handle the continuous nature of data collection, but less so the continuous nature of analysis, and it barely handles multiple data sources at all. Those latter two can probably be hacked onto it with some SQL, but it's a shame the analysis and import features of the product itself look to the past more than to the future.

On the other hand, if everything is continuously continuous, we lose milestones - a sense of accomplishment, of finishing something. Our minds may adapt to this in time (a Facebook status vs a letter to a dear friend, web snippets vs a book, YouTube vs a feature film), but for now there is still probably some need for clear end points for some of this (i.e. end of semester reports). But that can co-exist with continuous collection and analysis.

Happiness

How can one measure student happiness? Or anyone's happiness, for that matter.

Yearly or twice-yearly surveys do a decent job of this, but it would be interesting to see an ongoing measure during the year, and to correlate it against other events (start of year, holidays, exam time, even particular days of the week).

On first glance, a simple daily (or weekly) user rating might seem a valid solution. At the start or end of each day/week/class/<insert other time period here> the student clicks a thumbs up or thumbs down (perhaps with a 'meh' option as well). A very blunt instrument, but a starting point nonetheless. Comparisons between students might not be valid, as there are probably different circumstances and interpretations of happiness.

But perhaps the more difficult problem is the authenticity of the data. Someone writing in their own diary might bare their soul, as there is no audience - no-one to witness them in a weak, exposed state. But as soon as there is an audience, things change. A depressed student might not want others to know of this, someone with problems at home might be fearful of consequences, and someone feeling antisocial probably won't want the hassle of someone trying to 'help.' When asked to rate their happiness every morning, students have every right to ask 'why?' or 'who wants to know?' And with good reason - this is personal data that could be used for evil.
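
Mechanically, the instrument itself is trivial - the hard part is everything around it. A sketch of the recording side, reporting only in aggregate rather than per student (the scale and names are made up):

    from collections import defaultdict

    # -1 = thumbs down, 0 = 'meh', +1 = thumbs up. Stored per student so
    # trends can be followed, but only ever reported in aggregate.
    ratings = [("S1234", "2012-07-23", 1),
               ("S5678", "2012-07-23", -1),
               ("S1234", "2012-07-24", 0)]

    by_day = defaultdict(list)
    for student, day, score in ratings:
        by_day[day].append(score)

    for day, scores in sorted(by_day.items()):
        print(day, sum(scores) / len(scores))  # daily cohort mood, -1..+1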

So trust is a huge factor here. The student would need to trust the school, the survey, and the system - to trust that the reasons behind the data collection are sound. At the start of a student's enrolment that trust is not there, and yet that is one of the most useful times to have access to this data.

There are other indicators one could try to extract this sort of information from - attendance, participation, performance. But the accuracy of those might not be great, particularly without a baseline to measure against.

Thursday, July 19, 2012

Crash and burn

Over a month since the last weekly review - hardly ideal. There really needed to be one at the end of the previous term, but with reports eating into the start of the holidays, there wasn't quite the closure (nor the time). And there were bigger things to worry about during the break.

All indications point to a similar term ahead. I'll potentially be out of action for much of term 4, so planning, training and processes need to happen now, but I haven't been able to grab the ear of the powers that be to get this rolling. Will just have to persevere. Still recovering from a cold, so no thoughts for the week in this review; just planning and administrivia.

Thursday, May 31, 2012

Weekly wrap up

A very thoughtful week - discussing and digesting the nature and purposes of data in a school environment. Also a number of interruptions - from PD and a study tour to Bendigo in weeks past, to various meetings with a variety of people (Monash Security, a Swiss international school, Samsung, a Monash eSolutions strategic consultant, Cisco wireless engineers, and then some).

Any scheduled meeting inherently affects what I do on that day. I won't get started on something I know could take hours if a meeting is coming up, even though I'd likely get interrupted and not spend hours on it anyway. A little mental barrier to overcome, or work with.

Anyway, on to the thoughtful stuff: system control vs individual control. I've been looking at this through the lens of data collection, and more specifically, assessment. The more control put into the system, the more useful the data becomes at a systemic level. Certain tasks can be automated and removed from the individual. Decisions can be removed from the individual (this can be good as well as bad). Things become consistent - from a student's (and administrator's) point of view this can be a great thing. The flip side is that systems bring momentum, and become hard to change. Individuals can experiment and try new things when there are fewer system constraints.

Our school has long been down the individual (or small group) end of the scale. Teachers and subject groups can use what they want, and adapt how they see fit. Play around with stuff, and keep what works. The drawback is in sharing what works, and in building a useful body of student data - with no consistent performance measures, this is difficult to achieve. The next question is to ask to what end this data is being collected and used - I can see why this is done in a typical school, but JMSS ain't no typical school. I also need to think about how badges can fit into this equation.

Wednesday, May 30, 2012

Assessment indicators

In our discussions on educational data analysis and reporting, we've been looking at what data to collect and display, and what to compare it against. A student at JMSS will study some subjects in year 10. Those have assessment tasks, and are also given VELS scores. Some subjects carry across both semesters, some don't. The student will study a completely different set of subjects in year 11. Some will have a certain amount of carry-through; many won't. These subjects have assessment tasks and VCE outcomes, which are pass/fail. Year 12 is similar, but slightly different again.

Any piece of data, to be meaningful, requires a context. It needs more data around it. That can be provided by time, by peer data, by goals, and more. The effect of our situation is that there is no meaningful data to compare over time, nor for the student to set goals against. We could aggregate certain assessment data at the end of each semester and use that, but a semester is a long time, and subjects are quite diverse, as are the assessments and marking practices even within one subject group. We could compare a student's datum against the cohort (e.g. place that student on a box-and-whisker plot), but that arguably doesn't achieve much - it's basically a league table, and this sort of competition can be counter-productive.

This is where VELS is really quite nice: it provides a consistent set of indicators across year levels. Perhaps we need to extend collection of VELS data beyond year 10, or develop our own set of indicators. This would increase teachers' assessment and reporting workload, and thus would need to be extremely well thought out - in fact, integrated. If every assessment task used a rubric that related directly back to these indicators, then the mere act of marking the assessment would cover the extra work. Developing these indicators would be no mean feat - curriculum frameworks take time to develop. But informal discussions suggest at least two faculties have already been considering such a set of indicators - maybe it's worth investigating.
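
To make the integration idea concrete, a hypothetical sketch: each rubric criterion points at an indicator, so indicator scores fall out of ordinary marking via a join (none of this reflects how Accelerus or Compass actually work):

    import sqlite3

    db = sqlite3.connect("indicators.db")

    # Each rubric criterion on a task maps to one school-wide indicator,
    # so marking the rubric *is* the indicator data entry.
    db.executescript("""
        CREATE TABLE IF NOT EXISTS indicator (id TEXT PRIMARY KEY, name TEXT);
        CREATE TABLE IF NOT EXISTS criterion (
            task TEXT, criterion TEXT, indicator_id TEXT);
        CREATE TABLE IF NOT EXISTS mark (
            student_id TEXT, task TEXT, criterion TEXT, level REAL);
    """)

    # Per-student indicator scores come out of a join over ordinary
    # rubric marks - no separate reporting step for the teacher.
    rows = db.execute("""
        SELECT m.student_id, c.indicator_id, AVG(m.level)
        FROM mark m
        JOIN criterion c ON m.task = c.task AND m.criterion = c.criterion
        GROUP BY m.student_id, c.indicator_id""").fetchall()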

It is worth asking, if we can't use assessment data meaningfully, what is the purpose of that assessment?

(The data discussed here is what's directly linked to assessment items - i.e. markable things. We also have monthly progress reports, whose data we can use more meaningfully.)

Sunday, May 27, 2012

The School and the Software Developer

JMSS have much of their eServices in the cloud, or running locally but created and maintained by an external provider. Basically, most things are commodity systems.

Department systems

The Department tries to provide all the systems a school would need. Unfortunately, being a large bureaucracy, it doesn't do a particularly good job of fitting any one school's needs. Nonetheless, there are certain mandated IT systems in place, which we have to use, or at least interface with. Smaller schools tend to rely on these systems for financial reasons.

Commodity systems

Things not covered by the department, or not done well, are often outsourced. This might be a service in the cloud, a service hosted locally but maintained remotely, or hosted locally with support provided. Some of these fill the exact niche required. They tend to be written by organisations specialising in that type of software or service, so they have the experience of multiple clients to draw on. Some fit well enough, or provide some flexibility in their implementation. Others fit well enough at the time of commissioning, but can't adapt to the school's changing circumstances or requirements.

Home-grown systems

Where an outsourced system isn't available, is too expensive, doesn't fit, or isn't agile enough, schools can implement home-grown systems. This obviously requires a developer (or team thereof). While a home-grown system can be tailored exactly to the school's requirements (even moving targets), the big downside is maintainability. Programmers don't come cheap, and sysadmins who try their hand at programming may not be sufficiently skilled in software lifecycle management - documentation, maintenance, scaling. When the person who wrote the software leaves - what happens then?

This leads us to the role of the programmer in the school. There are three options:
  1. Settle for a less-than-ideal solution by using a department or commodity system.
  2. Work closely with a company to create/modify and maintain a custom system.
  3. Employ a developer (or team thereof).

While option 1 might be the only way for some schools, I shall not seriously consider it here. Option 2 is a viable possibility, but most software development organisations work for business (that being where the money is). They don't necessarily understand the needs of an education environment. Further, this option requires someone at the school to create a watertight specification of the service required - and in doing that, you're half-way to option 3. Before embarking on a school-driven software project, however, a spec is certainly needed, and processes need to be in place: source code management, a documentation system and conventions, and a succession plan. My preference is for small components that work with other systems (be they home-grown, commodity, or department). Modularity leads to flexibility, and agility.

Wednesday, May 2, 2012

On multiple systems

Before I get started, a comic:

(source: xkcd.com/927, 'Standards')

Not quite the same, but very similar to the multitude of ICT systems we use at JMSS. The Department of Education provides the school with CASES (the central school admin system), which does some things, but falls behind in access and UI. So some smart people developed Compass. In an earlier day and age, a school's techies might develop their own DB systems to keep track of their school's data (attendance, timetabling, reporting, finances, etc etc). But in-house developed software requires maintenance, which might not be feasible in small organisations like schools. I understand that Compass grew out of such a system, but the development and maintenance happens elsewhere. Great. Only, rather than replacing all of the existing systems, it adds to them. While it has messaging capabilities, people still use email. There is a schedule, but only for school events, and it doesn't gel with our calendaring system quite as nicely as you'd like. Purchase orders still require a print-out. We use a separate timetable system, and reporting has been put on hold.

Which is what I'm leading in to. We do semester reports using Accelerus - a dedicated system for assessment data reporting and analysis. But interim reports go into Compass. Compass doesn't do much with them except present them to parents and students, so one of our staff members developed an Excel sheet to present this data in a colourful format that can be filtered by student, tute group, etc, so tutors can get an idea of where their cohort is at. We've now cut off parent access to the interim reports, and are using them for school use only. So:

  • Interim report data entry: Compass (web based)
  • Interim report analysis: Excel
  • Semester report entry: Accelerus
  • Semester report analysis: Accelerus
  • Semester report parent viewing: Compass

Is that too much? Should there be one system to rule them all, or does diversity bring resilience? If a teacher has to enter three different spaces to gather the data on a student/class/subject/etc, they might start forgetting which is which. The systems might have different representations, or not 'link' together. Or perhaps I'm trying to dumb this down too much. It is a complicated problem. And these aren't even all of the relevant data we have on each student.

The same goes for any other system. As soon as 'one system to rule them all' is implemented, another requirement arises that can't or won't be implemented in that system. Someone will come along and implement it independently. They might talk to some extent, but there are now two systems. Big systems have large momentum, and don't respond quickly to user needs. Small, independent, communicating systems (The Unix Way) help, but with heavily interlinked data, how do they fit in?

Tuesday, May 1, 2012

A meta-reflection

Most of the blogs I read are somewhat widely read. The blogger writes for an audience: an audience that is mostly unknown, perhaps loosely known through comments and social systems, and maybe a few well-known readers. The style of such writing is thus quite different from writing for oneself, or for a small, well-known audience - among other things, context is a major consideration. How much background info does one need to detail? Such reflection is certainly useful for later reading by oneself, and may at some later point attract readers, so in one sense one should probably write as though one has an unknown audience - even if that unknown audience is one's future self. Still, it feels awkward to be writing to many, when one knows that in fact very few (if any) people are actually reading it now.

I usually start a reflection spree with some meta-reflection such as this; some actual content will be forthcoming.