News, Courses and Scrutiny

I think I may have confused Stephen Downes yesterday with my notes around consultation-based courses, so here are some more loosely connected thoughts that will probably only serve to muddle the situation further, at least for now…;-)

Take the forthcoming UK Parliamentary Communications Green Paper that will lead to a revision of the legislation surrounding communications in the UK. In part, this will draw on the DCMS Communications Review carried out earlier this year according to the following process: “An open letter was published on 16 May 2011 asking a broad range of questions about the communications sector. All non-confidential responses to the letter were published on 7 December 2011. Submissions received will be used to inform the Green Paper.” (The public submissions are available as individual documents in either RTF or PDF format.)

The open letter [PDF] included a series of questions relating to communications policy. For example:


Q6. What are the competing demands for spectrum, how is the market changing and how can a regulatory framework best accommodate any rapidly changing demands on spectrum and market development?

Q12. What barriers are there to innovation in new digital media sectors, including video games, telemedicine, local television and education?

In a consultation-framed course, the consultation questions may be thought of as part of the assessment model. One of the aims of the course is to provide “students” taking the course with the knowledge, skills and understanding required to provide a considered response to some or all of these questions.

Note that we may wish to qualify the reading of a question, or wrap it with additional criteria; for example, we might tune Q6 above along the lines of: “What particular issues are likely to arise in the 300MHz to 3GHz band?”, or something like that!

In the Related Information section of the Communication Review, links were provided to a Research report [on] the Contribution of the digital communications sector to economic growth and productivity in the UK and the Government’s broadband strategy among other things. In a sense, we have been gifted some “course readings”. There are also opportunities to dip into research that maybe doesn’t get read (or scrutinised) as widely as it might, in the form of Parliamentary Library Research Briefing papers.

So that’s part of the jigsaw: reviews, consultations and calls for evidence all involve policy makers soliciting evidence and opinion around a topic area that may include technical considerations. Where questions are asked, these may form part of the reflection/self-assessment/course assessment framework. The original call may itself be viewed as a high-level syllabus of the topics to be addressed in the course. The course can then address these issues with reference to teaching material (for example, if we’re considering innovation, we might call out some introductory OpenLearn materials on “Characteristics of consumers and the market”).

Whilst the aim of the review, consultation or piece of proposed legislation may not in itself go too deeply into technical areas, it can be used to provide the SPEL (social, political, ethical, legal) context around a technology area and provide a jumping-off point for a technical lesson in that subject area (for example, we may want to consider the similarities and differences between wired networks and wireless networks; or we may need to get up to speed on what optical fibre networks are good for).

Part of the story, then, is to try to take the lazy route to curriculum development and reuse someone else’s, which in this case also amounts to repurposing a document or process that wasn’t intended as a course to provide some of the content, topic, cohort-discovery and pacing components of a course.

This repurposing lends an element of authenticity and relevance to the course of study (though as mentioned in my previous post, we must be wary that the course is not used as a vehicle for delivering propaganda).

What the approach may also do is increase the amount of scrutiny around a review or route to legislation. In the post No Minister: Any chance for the Communications Act?, Guardian Professional writer Dick Vinegar notes:

Last time around, in 2003, Lord Puttnam, a film director with the right blend of artistic and technical expertise, carried out a pre-legislative scrutiny. I believe that this knocked the heads of broadcasters (fluffies) and comms engineers (techies) together to produce a good bill. From what I have heard so far, I am not sure whether this time around we will get such a mature, ‘two cultures’ approach.

By providing a course-like view over a consultation or review, we can maybe increase the amount of scrutiny involved in the process and also (maybe) deepen people’s understanding of the issues.

The course view thus provides a structured pathway through the relevant issues at a deeper level than that provided by the typical supporting documentation, or perhaps just in a more reflective way. The course also provides a way in to citizen engagement for individuals who just want to explore the topic.

The consultation-framed course also provides a way of straddling news and academia, an area that has also interested me in a lifelong learning context for some time.

This could manifest itself in a couple of ways. For example, long form news articles could feature “academic” breakout boxes using OERs sourced from the course, or course discussions could be positioned around issues raised in recent news articles; in a wider context, entry routes to the course may be provided through the news media, from readers who want to know a little more about the issues involved within a particular consultation area (c.f. News, Analysis, Academia and Demand Education or Educative Media?).

Another interesting feature that arises out of the consultation-based course learning journey is that “authentic assessment opportunities” present themselves: for example, a student may submit an actual response to the consultation, or, if they entered via the news route, write a letter to the editor. Writing responses in the form of research briefing papers also provides another format for producing work that may be used to demonstrate understanding and knowledge in a meaningful, potentially useful and assessable way.

The tone with which reviews or consultations are presented is also interesting from an educational perspective, in that the questions that are asked may be open and may not have a single right answer. (On the other hand, in calls for technical expert evidence, there may well be “correct” answers which the evidentiary call is intended to discover.) This frames the learning activity in the context of “we don’t know what the right answer is, but we need to find out/learn more”. That is, the consultation is in some sense modelling part of the lifelong learning behaviour we want to inculcate in our students (learning is not just for school or university, right?!;-)

Is there a demand for such an exercise though? Again referring to the Guardian Professional article:

In the run up to the green paper, Westminster has been awash with conferences and seminars with titles like ‘What should be in the new Communications Bill?’ and ‘Dear Jeremy…’ (Hunt). Most of the speakers at these portentous events have been full of patriotic hyperbole and statements of the obvious. “The next Comms Act should focus on ensuring that the UK’s communications sector remained one of the most competitive in the world.” “A level playing field is needed in the internet ecosystem with global issues considered carefully.” “Regulation must not chill innovation.” “The limits of online privacy must be defined.” “Children must be protected.”

PS I mentioned in the previous post how at least one of the forums around the forthcoming Communications Green Paper was “CPD certified”. A little digging turned up The CPD Certification Service, which is presumably what that referred to. Anyway, I’ve added it to my watchlist to see if Pearson, or other companies of that ilk, start sniffing around it as a gateway to one possible new credentials market…

PPS Are there any emerging leaders in the qualification verification arena yet?

Author: Tony Hirst

I'm a Senior Lecturer at The Open University, with an interest in #opendata policy and practice, as well as general web tinkering...

5 thoughts on “News, Courses and Scrutiny”

  1. Tony

    This looks very interesting to me. I’m currently involved with Middlesex University in writing a work-based learning module on course information management (based on XCRI). One of the challenges of designing the curriculum is that it has to have a shelf-life of several years, during which time things will have moved on considerably – it’s quite likely that technologies will have changed the domain significantly. Ideally it would be very useful to get the learners to reflect on these possible changes, and to comment on them as part of assessment. This could be very similar to your ‘consultation-framed’ approach.

    But how do we handle this kind of assessment (as summative assessment), when we don’t necessarily have a yardstick to measure against?

    Alan Paull
    alan@alanpaull.co.uk

    1. @Alan where you want to provide ongoing training/current awareness around something like a standard, which is subject to change, there are a couple of issues, I think:

      1) getting an initial understanding of the standard
      2) keeping up with changes to the standard.

      I was at a panel meeting last week looking at data standards, and a comment was made there that where new versions of standards are released as complete reprints, folk are unlikely to read them. Instead, what we need to do is communicate the diff(erence).
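      To make the “communicate the diff” point concrete, here’s a minimal sketch using Python’s standard difflib module to show what changed between two versions of a (wholly invented, for illustration) fragment of a standard, rather than asking folk to re-read the whole reprint:

```python
import difflib

# Two made-up versions of a few clauses from a hypothetical standard.
v1 = """A course record MUST include a title.
A course record MAY include a description.
Dates use the format DD/MM/YYYY.""".splitlines()

v2 = """A course record MUST include a title.
A course record MUST include a description.
Dates use the format YYYY-MM-DD.""".splitlines()

# unified_diff yields only the changed clauses, with context,
# marked - (removed) and + (added).
diff = list(difflib.unified_diff(v1, v2, fromfile="v1.0", tofile="v2.0",
                                 lineterm=""))
print("\n".join(diff))
```

      Readers then see at a glance that the description clause was tightened from MAY to MUST and the date format changed, which is exactly the sort of summary a “what’s changed and what’s the impact” assessment question could be hung off.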

      I think where assessment is concerned you can write an assessment along the lines of: “how has the standard changed between previous.version and current.version, and what impact is this likely to have on x, y and z?”

      It may also be possible to frame assessment around:

      1) identifying original assumptions, particularly as they relate to the climate at the time the assumptions were made;

      2) comparing the current situation with the situation then to see whether or not the assumptions still hold;

      3) identifying which elements of the standard (or whatever) were tightly linked to one or more assumptions;

      4) identifying how different parts of the standard are dependent on other parts of the standard, and seeing what side effects changing assumptions might have as they ripple through dependent parts of the standard.

      In this way, you focus attention on the mutable parts of the standard that are dependent on, or influenced by external factors. (Of course, I suspect standards makers would like to think their standards are purely abstract creations that exist largely outside of time, space and assumption;-)
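      Point 4 – tracing how a changed assumption ripples through dependent parts of a standard – can be thought of as a reachability question over a dependency graph. A toy sketch (the section names and dependencies below are entirely invented for illustration):

```python
from collections import deque

# Hypothetical standard: for each section, the sections that depend on it.
dependents = {
    "date-format": ["course-record", "feed-envelope"],
    "course-record": ["aggregator-api"],
    "feed-envelope": ["aggregator-api"],
    "aggregator-api": [],
}

def ripple(changed_section):
    """Return every section affected, directly or indirectly, by a change."""
    affected, queue = set(), deque([changed_section])
    while queue:
        section = queue.popleft()
        for dep in dependents.get(section, []):
            if dep not in affected:
                affected.add(dep)
                queue.append(dep)
    return affected

print(sorted(ripple("date-format")))
# → ['aggregator-api', 'course-record', 'feed-envelope']
```

      A student asked to produce (and then query) such a map of a real standard is doing exactly the “which parts are tightly linked to which assumptions” analysis described above.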
