Tagged: flipped

Pause for thought: Flipping in study skills into the curriculum

There have been a number of posts on this blog around implementing flipped classroom learning designs. As these have been released, conversations have started, and the topic keeps coming back to evidence of impact.

Over time, a number of different teaching and learning models have emerged under the banner of the ‘flipped classroom’. These offer exciting opportunities in terms of widening engagement and impact, but also raise concerns around student motivation and, for staff, issues around workloads, skill sets and changing roles.

The flipped classroom is a “pedagogical model that employs asynchronous video lectures, reading assignments, practice problems, and other digital, technology-based resources outside the classroom, and interactive, group-based, problem-solving activities in the classroom.” (Hawks 2014:264).

Importantly, “although there has been little research on the educational outcome as it relates to whether the flipped classroom increases student learning, there has been a lot of indirect research (eg, student and instructor satisfaction surveys) promoting this approach” (Gilboy et al., 2015:110).

In their large-scale HE study, Gilboy et al. (2015) identified a very positive perception from students of their learning experience in a flipped model. Evidence is also starting to emerge of improvements in examination scores (although not student satisfaction) with flipped teaching models (Missildine et al., 2013). This is really interesting: empirical evidence of improved examination performance, but students not necessarily liking the teaching model. This might be expected; an effective flipped classroom model does require more continuous work from the student.

The learning design within a flipped model also needs to change: Brunsell and Horejsi (2013) argue that showing video lectures alone is not flipping your classroom. They propose that concerns around student motivation and participation require adding an active learning experience.

Disappointingly, there has been little published empirical work around flipped classroom techniques for the delivery of study skills within Higher Education.

It is this area of deficiency where people-development teams need to place their emphasis. We need to start evaluating the impact of these emerging models on student and staff learning. We are people developers who design learning programmes; the question is, are we able to capture the impact?


  • Brunsell, E. and Horejsi, M. (2013) ‘A flipped classroom in action’, The Science Teacher, 80(2), p. 8.
  • Elevate | University Campus Suffolk (2015) Elevate Team. Elevate | University Campus Suffolk. Available at: http://ucselevate.blogspot.co.uk/2015/02/flipping-in-study-skills-ict-literacies.html (Accessed: 17 February 2015).
  • Gilboy, M. B., Heinerichs, S. and Pazzaglia, G. (2015) ‘Enhancing Student Engagement Using the Flipped Classroom’, Journal of Nutrition Education and Behavior, 47(1), pp. 109–114.
  • Hawks, S. (2014) ‘The flipped classroom: now or never?’, AANA Journal, 82(4), pp. 264–269.
  • Missildine, K., Fountain, R., Summers, L. and Gosselin, K. (2013) ‘Flipping the Classroom to Improve Student Performance and Satisfaction’, Journal of Nursing Education, 52(10), pp. 597–599.

Flipping into a taught course: a flow-diagram study skills package

The video gives more insight into flipping in a study skills session in a taught course at UCS. It looks (first impressions) to have been very successful. There are 42 students on the course, of which 29 submitted the pre-session activity. Assuming they all pass, the much shorter workshop will have 13 people, making it more focused on individual need and hopefully more effective for the individual learner.

We’ve looked at reducing the workload with this roll-out, so it included a departmental colleague (Tom Ranson) in a collaborative development model, the re-use of some existing guides, and the use of third-party materials. The assessment component uses Blackboard’s Rubrics.

Flipped Classroom: Study skills – Creating a flowchart

I thought it might be useful to talk through the next iteration of moving ICT Study Skills workshops to an online model. This builds on the previous posts around developing flipped classroom models for Excel, with the underpinning principle that ability is normally distributed; therefore, around 80% of our students can already complete this task. So, our face-to-face session should be for those who can’t, and need help.

The general session plan is outlined in the embedded flow chart. This might have a few tweaks, as I’ll need to finalise things with Tom.

Some enhancements on the previous model include:

  1. clearer signposting in an earlier lecture about the task and the decisions students need to make
  2. we are going to recommend specific software (http://www.lucidchart.com)
  3. the online support material will be more structured (see http://ucselevate.blogspot.co.uk/2015/01/getting-started-guides.html)
  4. both the online learners and the face-to-face workshop learners will need to submit their flow diagrams
  5. the submission process has been enhanced: it will involve students agreeing that this is their own work and that they can complete the tasks. It will also include a few multiple-choice / free-text questions on what buttons to push to complete the task, drawn at random from a larger question bank
  6. open badges will be provided as recognition that they have completed this in-curriculum activity
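The question-bank idea in point 5 is simple to sketch; a minimal illustration, assuming a hypothetical bank of button-pushing questions (the real pool would live in Blackboard’s question pools):

```python
import random

# Hypothetical question bank; the real questions would sit in the VLE's pool.
question_bank = [
    "Which button inserts a new shape into the flow chart?",
    "Which menu lets you add a connector between two shapes?",
    "How do you label a decision diamond?",
    "How do you export the finished diagram for submission?",
    "Where do you change the text inside a shape?",
]

def draw_questions(bank, n=3, seed=None):
    """Draw a limited number of questions at random from the bank."""
    rng = random.Random(seed)
    return rng.sample(bank, n)

print(draw_questions(question_bank, n=3, seed=42))
```

Each student then sees a small, varied subset rather than the whole bank.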

What are the outcomes of the Excel flipped classroom model?

Many of the recent posts have been around expanding what a flipped classroom model might mean as an alternative to a traditional delivery session. This design was based around developing lower-order learning skills in being able to use Excel to complete three distinct tasks. The intention is to ensure people have the required skills and are aware of the help routes, and to make sure the face-to-face session is attended only by those who need the help, using the pre-session online tasks to filter out those who can already complete the tasks and therefore do not need to appear.

From the perspective of the session facilitator, the learning model was effective:

  1. 75% of the group were signed off with the skills before the taught session, so they did not need to attend
  2. the taught session was reduced from two hours to one hour
  3. 5 people attended the face-to-face session, and could get direct / personalised help. In fact, 4 out of the 5 had completed most of the tasks before they attended
  4. using the marking criteria (rubric in Blackboard VLE), it took a maximum of 1 minute to assess and mark each submission
  5. the learning materials (spreadsheet and support videos) and a version of the design have been re-used on another course, reducing future authoring time

I have received some feedback from the students on the experience, but only 3 responses … so take it with a pinch of salt 🙂

Two out of the three strongly agreed that their knowledge and skills in using a spreadsheet to calculate descriptive statistics (mean, mode, median, standard deviation), plot bar charts and plot scatter plots with trend lines had improved since completing the online activity. When asked why, one response was “because the activity was explained step by step I didn’t miss anything out and so was more confident with the end result”.

When asked whether they would prefer attending a one-hour workshop or completing the task online, interestingly, two out of the three said they would prefer attending the face-to-face session. The rationale was the feeling that an expert would answer questions, reassure them and help them find shortcuts. So, the question would be: within this design and cohort, how can we build support networks? The hope would be that they’d support each other.


Excel support and the self-diagnostic quiz (step 3)

I’ve recently been posting a lot around developing an online provision which will allow us to roll out skills development around Excel (and other IT packages). This has focused on the idea of setting a diagnostic quiz: if students reach the minimum requirement (80%), they are not required to attend the lecture. The driver is that a large number of these students will already be able to complete the required skills / competencies around using the tool (or can learn them using the support material). We need to spend our time on those who are struggling.

Overall, it seems to be going very well. I have worked through a simple communication plan which aims to encourage students to complete the quiz (perhaps they are strategically motivated by ‘do this or you need to attend the lecture’), while the marking is very low-resource. I’ll improve this for the next iteration (end of October) by including a pool of objective questions based on the data set, meaning they can get feedback as soon as possible.

The next question must be … what happens if they don’t do it, or if they don’t make the 80% pass mark? Well, the plan is very simple: given the numbers should be low, I’ll set them in groups, give them the task, check the outputs and sign them off. The handout will have information about how to access the online support materials.

The decision tree is ….

1) Did they complete the online diagnostic?

  • Yes
    • Did they pass?
      • If yes, sign off
      • If no, email them that they need to attend the session
  • No
    • Email them to attend the session

2) Did they attend the face to face session, and complete the task?

  • Yes
    • Update the report register
  • No
    • Update the report register

3) Email Report Register to Module Leader
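The decision tree above can be sketched as a small routine; a minimal sketch in Python, with hypothetical function and register names (the real workflow runs through email and Blackboard, not code):

```python
PASS_MARK = 0.8  # the 80% minimum requirement on the diagnostic quiz

def diagnostic_outcome(completed_quiz, score=None, total=None):
    """Step 1: route a student based on the online diagnostic."""
    if not completed_quiz:
        return "email: attend the session"
    if score / total >= PASS_MARK:
        return "sign off"
    return "email: attend the session"

def update_register(register, student, attended_and_completed):
    """Step 2: update the report register whether or not they completed the task."""
    register[student] = "completed" if attended_and_completed else "not completed"
    return register

register = {}
print(diagnostic_outcome(True, score=9, total=10))   # sign off
print(diagnostic_outcome(True, score=6, total=10))   # email: attend the session
print(diagnostic_outcome(False))                     # email: attend the session
update_register(register, "student_a", True)
# Step 3 would then be emailing the register to the module leader.
```

Note that both branches of step 2 update the register, so the module leader’s report covers every student.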

Given this is a skills / competency activity, these can be quickly rolled out, and we are much more flexible in terms of who runs the face-to-face event.

The devil is in the detail … what is in the online quiz?

A couple of people have said they are happy(ish) with the flipped classroom for Excel skills, but what do you actually ask (within the quiz), and how long does the assessment take?

Given the task is a skills-development session to ensure they can perform some introductory tasks in Excel, the quiz questions include:

Three objective questions – auto-marked (based on the calculated descriptive statistics)

  • Calculated Numeric Question: To two decimal places, what is the mean value for Experiment B?
  • Calculated Numeric Question: To two decimal places, what is the standard deviation in Experiment C?
  • Calculated Numeric Question: To two decimal places, what is the range in Experiment B?
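Auto-marking these simply compares the submitted value with the figure computed from the distributed data set; a minimal sketch in Python, with hypothetical data standing in for the real Excel file:

```python
import statistics

# Hypothetical data; the real quiz uses the data set distributed as an Excel file.
experiment_b = [12.1, 13.4, 11.8, 14.0, 12.7]
experiment_c = [9.5, 15.2, 7.8, 18.1, 11.4]

mean_b = round(statistics.mean(experiment_b), 2)           # mean, Experiment B
sd_c = round(statistics.stdev(experiment_c), 2)            # sample standard deviation, Experiment C
range_b = round(max(experiment_b) - min(experiment_b), 2)  # range, Experiment B

print(mean_b, sd_c, range_b)  # 12.8 4.21 2.2
```

A Calculated Numeric Question in Blackboard accepts the answer within a tolerance, so rounding to two decimal places keeps the marking unambiguous.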

Two subjective questions (based on plots)

  • The standard deviation in Experiment C is greater than in Experiment B, what does this mean in plain English?
  • What can you infer from the scatter plot about the relationship between the temperature and gas Z emissions?

Upload their spreadsheet

  • Quick visual check

This works very well. To speed up the checking process for the file upload and the two subjective questions, I use a very simple rubric (marking criteria) within Blackboard. The student can access this rubric before they submit. The idea is not to make this complicated; the broad intended learning outcomes are: 1) ensure they can calculate descriptive statistics, create a bar chart, and create a scatter plot with a trend line and r² coefficient; 2) ensure they know where to find online help.

Therefore, the marking criteria (rubric) allow me to click the grade classification and reflect the technical competencies. For instance, the rubric for the file upload is …

Part A: Scatter plot:

  • 20 points – scatter plot is included for two experiments on one plot, trend lines are included, and labels / titles are included
  • 15 points (minimum pass for question) – scatter plot is included for two experiments on one plot, but with no trend line; labels and titles are included
  • 0 points (need to attend workshop) – no plot, wrong plot, not including two experiments, or axes not labelled

Part B: Bar chart:

  • 20 points – bar chart is included for all three experiments on one plot, and labels / titles are included
  • 15 points (minimum pass for question) – bar chart is included, but not all three data sets are included; labels and titles are included
  • 0 points (need to attend workshop) – no bar chart, wrong plot, or axes not labelled
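These criteria map cleanly onto points; a minimal sketch of the scoring logic in Python (function names are hypothetical, and in practice the marking happens by clicking the rubric in Blackboard):

```python
def score_scatter_plot(has_plot, two_experiments, has_trend_lines, labelled):
    """Part A rubric: 20 points / 15 (minimum pass) / 0 (needs to attend workshop)."""
    if not (has_plot and two_experiments and labelled):
        return 0
    return 20 if has_trend_lines else 15

def score_bar_chart(has_chart, all_three_experiments, labelled):
    """Part B rubric: 20 points / 15 (minimum pass) / 0 (needs to attend workshop)."""
    if not (has_chart and labelled):
        return 0
    return 20 if all_three_experiments else 15

print(score_scatter_plot(True, True, False, True))  # 15 – minimum pass, no trend line
print(score_bar_chart(True, False, True))           # 15 – minimum pass, data sets missing
print(score_scatter_plot(False, True, True, True))  # 0  – needs to attend the workshop
```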

Using the Blackboard Rubric within the quiz (test) means the subjective questions can be marked quickly. I’d suggest marking the assignment for one person (including all the feedback) takes about 1 minute.

Flipped classroom model for acquiring Excel skills. Or why a traditional teaching model doesn’t cut the mustard

Since we’ve been restructured and formed the wider Learning Services Group at UCS, we’ve been given a great chance to redesign the Excel skills sessions we run. The traditional model at UCS was for students to attend a lecture (show-and-tell format) given by a member of Learning Development, with a supporting handout. This would cover a range of topics, from using Excel for descriptive stats and simple data visualisation (bar charts) to more complex approaches, including scatter plots.

However, I would challenge this model on various grounds. In particular,

  1. is it really the most effective way to teach someone Excel skills? The learner is relatively passive within this model
  2. it does not account for previous knowledge and skills. Why does a person need to attend a teaching session if they already have the skills and can demonstrate them? Surely we should focus our limited resource on having the greatest impact?

So, the new learning model is based around a flipped classroom concept. The primary aim is to ensure the generic skills are acquired and build on what they already know. So, just help them work it out. The following outlines a simple model we’ll apply for two programmes at UCS.

Pre-session activity within their LearnUCS (VLE) Module


  • Provide the activity descriptor, and the data set (downloadable as excel file)
  • Provide an authentic support video on how to complete the task (use different data but same layout) <www.youtube.com/embed/HtEdUS653Dw>
  • Quiz – include a set of objective questions based on the values they’ve calculated, a set of subjective questions based on the broad interpretation (scatter plots, with line of best fit), and a file upload question to see their finished spreadsheet
  • If they pass (criteria to be set), they do not need to attend the session, and will receive an open digital badge as recognition. If they do not pass, they will be contacted about attending the surgery session


Face-to-face surgery session

  • Run as a Q&A. It will replace the lecture, and will include a few members of the Learning Services Team to enable small-group sessions.

Post-session activity within their LearnUCS Module

  • Provide the completed (correct) solution as a video, so students can watch it and self-reflect.


Image – with thanks: http://upload.wikimedia.org/wikipedia/commons/2/27/Excel_chart.PNG