AI in Education
Posted on: 30 May 2019 by Tunde Varga-Atkins in Conference & Event Reports
Tunde Varga-Atkins attended a symposium on 1st May 2019 at Birmingham’s ThinkTank and provided this report.
What was the event?
My interest in attending the Artificial Intelligence (AI) Chatbots in Higher Education symposium, organised by JISC, UCISA and Birmingham City University, was twofold. Firstly, to be inspired by what other institutions are doing and to bring those ideas back to our University. Secondly, to see whether there are ways in which we educational developers and learning technologists could make use of AI in our own work. Participants included both university members and vendors such as Oracle, Amazon Web Services and Microsoft, who attended as exhibitors and were also invited to the final panel discussion. So, what have I learnt?
An overview of the day:
During the symposium, I learnt that a number of universities are making progress in incorporating chatbots and digital assistants into their student support, and that these developments are financially viable without much upfront investment, a point the vendors above were keen to confirm. Various academic presenters emphasised that it is the ‘why’ and ‘how’ questions that drive (and should drive) these developments. The main use cases of AI focused on offering transactional support, probably the easiest wins for universities; however, more advanced, disciplinary uses of AI are also being developed. Below I discuss Bolton’s and Lancaster’s journeys so far with AI chatbots.
Bolton University's 'AskAda' digital assistant responds to students' frequently asked questions, such as ‘What is my university ID?’ or ‘What is my timetable for this week?’, and has done so for three years now. Their data show that most questions are asked at the beginning of the year, and using chatbots has increased retention in this important initial period by 3%. Bolton’s experience – as confirmed by various speakers and vendors during the day – stresses the importance of getting your university data and processes right first: this is the bulk of the work, and putting the AI interface and service on top of it proved relatively easy. The three main areas in which Bolton currently uses digital assistants are listed below (a minimal sketch of the FAQ-matching idea follows the list):
- general student support questions,
- working with student data sets, and
- finally, an area still in development: offering subject-specific/disciplinary support, e.g. in maths or hairdressing.
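To make the FAQ idea concrete, here is a minimal sketch of the kind of question matching such a service might sit on top of. All questions, answers and the similarity threshold are my own invented illustrations; a production assistant like AskAda would use a vendor's natural-language platform rather than simple string similarity.

```python
# A toy FAQ matcher: find the stored question most similar to the user's
# input and return its answer, escalating to a human when confidence is low.
# All questions, answers and the 0.6 threshold are hypothetical.
from difflib import SequenceMatcher

FAQ = {
    "what is my university id": "Your university ID is printed on your student card.",
    "what is my timetable for this week": "Your timetable is in the student app under 'My timetable'.",
}

def answer(question: str) -> str:
    q = question.lower().strip(" ?!.")
    best_reply, best_score = "", 0.0
    for stored_q, reply in FAQ.items():
        score = SequenceMatcher(None, q, stored_q).ratio()
        if score > best_score:
            best_reply, best_score = reply, score
    # Hand over to a person rather than guess when the match is weak.
    if best_score < 0.6:
        return "I'm not sure - let me connect you to student services."
    return best_reply

print(answer("What's my timetable for this week?"))
```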
The power of AI lies in linking personalised data with ‘nudging’ behaviour. For instance, if a student’s attendance falls, they may get a prompt from Ada signposting them to support, e.g. suggesting that they meet their tutor. Although these are potentially powerful uses of AI, speakers and the audience were also keen to point to considerations of safeguarding, privacy and security during the subsequent panel discussion. Careful design of these services is paramount, for instance thinking through when and where virtual support might need to escalate quickly to emergency personal contact. During the design of student services, cross-platform access needs to be considered, whether via web interfaces on laptops and PCs, mobile platforms or voice control. Bolton’s Ada does all of these – but this poses challenges for designers and content developers: a link to a video tutorial might be fine for a web user, but would not work for someone using voice control.
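As a hedged illustration of the nudge pattern described above, the sketch below triggers a supportive message when a student's attendance dips under a threshold. The threshold, message and data structure are all my own assumptions, not Bolton's actual implementation.

```python
# A minimal sketch of attendance-based nudging: if attendance falls below a
# (hypothetical) threshold, return a message signposting the student to their
# tutor; otherwise return None. Not Bolton's actual logic.
from dataclasses import dataclass
from typing import Optional

ATTENDANCE_THRESHOLD = 0.75  # assumed trigger level, for illustration only

@dataclass
class Student:
    name: str
    attendance_rate: float  # fraction of timetabled sessions attended (0-1)

def nudge(student: Student) -> Optional[str]:
    if student.attendance_rate < ATTENDANCE_THRESHOLD:
        return (f"Hi {student.name}, we've noticed you've missed a few sessions. "
                "Would you like to book a chat with your personal tutor?")
    return None  # attendance is fine, no nudge needed

print(nudge(Student("Sam", 0.60)))
```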
Robots in Birmingham’s ThinkTank Museum of Science
Lancaster University, inspired by Bolton’s example amongst others, has also developed a multi-platform chatbot service: 'Ask L.U.' [Lancaster University]. This already works with the students’ iLancaster mobile app and web platform, and Lancaster are currently working on integrating their digital assistant service into their intranet/student portal. Interestingly, one of the most frequent questions asked by students was about shop, pub and other opening times on campus. The university worked with the on-campus shops to gather this information, which had not been readily available, and in return offered to feed the data back and embed it in the shops’ own websites. This bartering benefited both parties, but especially the students.
From both examples, it was apparent that institutions might start with a general student-service (FAQ)-type AI platform, only to quickly broaden their horizons to developing other bespoke services. Mental health support and support for students with disabilities are among the most popular. As Chris Dixon pointed out, universities need to be able to offer customisable chatbot applications to different units and departments, so that each can build its own knowledge base. All of these point to exciting future developments.
The panel discussion touched upon the topics mentioned above: security, privacy, the student support perspective, personalisation, and offering parallel services with different degrees of automation, from digital chat services staffed by real people (shown to be popular with Bolton students) to completely automated digital assistants. The panel also included a linguist from Birmingham City University (BCU), and it was interesting to note the role of linguists in offering advice on conversation analysis during the development of these services. For instance, Lancaster’s ‘Ask L.U.’ project involved students and linguists in developing an identity for their digital assistant so that it speaks in a student-friendly manner. The Dean of BCU’s student affairs services talked about an exciting use case for an AI bot that would explain academic regulations to students; for instance, students could ask for information about assessment regulations. A number of speakers during the day highlighted the need to ‘translate’ academic jargon into student-friendly language, which would be especially true when creating a knowledge base/FAQ-type service for students who want to ask about exam regulations or similar issues.
What did you get out of the symposium?
In the spirit of academic criticality, I also offer some critical reflections on the day:
- What I missed from the panel discussion is how users of chatbots are told not just the answer, but where the information has come from. This might not be an issue for factual types of question (presuming the university’s data have been cleaned and sorted), such as my timetable for today. One presenter noted that an AI system is only as good as the data fed into it (garbage in, garbage out). Part of the argument for AI was that many support failures are down to human error, such as people using the wrong versions of documents, e.g. for re-sit regulations. So how can we be sure that the AI system has the most up-to-date information? And how can students tell whether the information they receive is likely to be right? From my perspective, I really like the work of our own Professor Katie Atkinson, Dean of Computer Science and Electrical Engineering, who is working with the legal profession to develop AI systems for automated decision-making. A central feature of this work is that lawyers require the decision-making of AI systems to be transparent (as opposed to a black box churning out decisions that the recipient cannot inspect or check). A small sketch after this list illustrates one simple way a chatbot could surface such provenance.
- Rigorous research and evaluation need to be built into the development of these services. Some anecdotal evidence was given of students finding these AI services useful, or being able to ask embarrassing questions (such as ‘What is my email address?’) that they would struggle to put to computing services, for instance. However, because students could potentially disclose sensitive information, and because the shyest or most disadvantaged students may be the ones using the service, it would be great to hear more about how institutions monitor and evaluate their AI/digital assistant services, and how they go about putting in place quality, safeguarding, safety and support/developmental measures. In addition, as Paul Richards highlighted in his introductory speech, if creativity, problem-solving and emotional intelligence are the capabilities we will need from students, and if students can rely on chatbots for all their information, how will they learn about human interaction? And more importantly, if the people who programme the data and algorithms for digital assistants have been able to get by without having to interact, how will they be able to ‘programme’ these features into the digital assistants?
- Given that we were talking about automation and digital technologies, it was interesting that no Twitter hashtag was advertised for the symposium, nor were any in-class polling technologies (pollev, menti or padlet) available for posting questions to the panel or the speakers: we had to write questions on a piece of paper during the breaks. I was having Twitter-withdrawal symptoms! This reminds me of Chris Jones’ research, which found that people may be very tech-savvy in certain areas but patchy in others.
- Finally, this field needs more female representation, not just in the audience but also among the panellists and presenters. Apart from one female student and one female dean of student support services, the presenters and panel members were all male. I would have loved to hear more about the female researcher’s areas of research, for instance. There is room for more impro(women)t!
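Returning to the transparency point in the first reflection above: one simple design response is to attach provenance to every answer, so that a student can see which document the information came from and when it was last updated. The sketch below is purely my own illustration of that idea, not a description of any system presented on the day; all field names and example content are invented.

```python
# A sketch of provenance-tagged answers: each reply carries the source
# document and its last-updated date, so the user can judge how current
# the information is. Everything here (fields, example text) is hypothetical.
from dataclasses import dataclass
from datetime import date

@dataclass
class SourcedAnswer:
    text: str
    source: str         # the document the answer was drawn from
    last_updated: date  # when that document was last revised

def format_reply(answer: SourcedAnswer) -> str:
    return (f"{answer.text}\n"
            f"(Source: {answer.source}, last updated {answer.last_updated:%d %b %Y})")

reply = SourcedAnswer(
    text="Re-sit examinations take place in the August assessment period.",
    source="Code of Practice on Assessment, section 6",
    last_updated=date(2019, 1, 15),
)
print(format_reply(reply))
```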
Final thoughts:
Finally, some thoughts on how we could use AI at LivUni, inspired by ideas from the day:
- Starting off with general student support services (timetabling, campus map, availability of PCs on campus, campus facilities, opening hours).
- Also developing more specialised student support services, e.g. support for students with disabilities, widening participation, support for mental health and wellbeing, and co- and extra-curricular opportunities – these could all benefit from digital assistants that are available 24/7.
- Further, assistants could be developed to help students interpret academic regulations (e.g. assessment guidelines, mitigating circumstances, etc.) and to signpost them to more specific local/departmental/academic advising support.
- There is plenty of scope to develop staff-facing services too; for example, speakers mentioned a PDR chatbot, which enables staff to ask questions about the PDR process and be prompted with answers. Quality assurance processes and requirements seem a natural candidate to become data sources for AI bots.
But:
- It was clear that to progress AI services, we would need to get our data in order first.
- We also need to consider from day one issues around ethics, privacy, safeguarding and security.
- It would be important to offer students the ability to control what data are collected and shared about them, so that nothing happens without their knowledge.
- And of course, we would need to address challenges around inclusivity, such as multilingual support and making support cross-culturally sensitive.
What AI services are you aware of in educational institutions – or what services do you think might be worth exploring for AI chatbots?
Keywords: AI, Artificial Intelligence, Chatbots, University, Student Support.