3rd year projects 2017 - 18

Mail all OAwoniyi1@sheffield.ac.uk dheard2@sheffield.ac.uk hhowarth1@sheffield.ac.uk AAChaudhary1@sheffield.ac.uk bfmillward1@sheffield.ac.uk AGkigkolian1@sheffield.ac.uk AMarketos1@sheffield.ac.uk

The project descriptions below are only intended as starting points. If you wish to discuss possibilities in greater detail I encourage you to email me to arrange a meeting.


JPB-UG-1: Climbing technique analysis

Description

This project is suitable for someone with an interest in sport climbing and computer vision. The idea is to build a tool for analysing climbing style from video recordings captured on indoor bouldering walls. In particular, the tool would allow the user to record multiple attempts at a boulder problem and to overlay the video sequences to form a composite showing both climbs at the same time. This would allow the user to examine small differences in technique, e.g. comparing the technique of two different climbers, or analysing improvement in performance over time using multiple recordings of the same climber. Note that the approach generalises to other sports: something similar is used in TV coverage of skiing, where multiple runs are composited so that the viewer sees two skiers on the same course. However, indoor climbing is particularly suitable because the routes have a fixed geometry, which makes the alignment problem easier to solve.
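The compositing step at the heart of the tool can be illustrated with a toy alpha-blend of two frames. This is only a sketch: in practice a library such as OpenCV would operate on real, aligned video frames (e.g. via cv2.addWeighted), and the function name and list-of-lists frame representation below are illustrative assumptions, not part of any existing tool.

```python
def blend_frames(frame_a, frame_b, alpha=0.5):
    """Blend two same-sized greyscale frames pixel by pixel.

    Each frame is a 2D list of intensities in 0-255; `alpha` is the
    weight given to frame_a in the composite.
    """
    return [
        [int(alpha * pa + (1 - alpha) * pb) for pa, pb in zip(row_a, row_b)]
        for row_a, row_b in zip(frame_a, frame_b)
    ]

# Two tiny 2x2 "frames": one all-dark, one all-bright.
dark = [[0, 0], [0, 0]]
bright = [[200, 200], [200, 200]]

composite = blend_frames(dark, bright)  # 50/50 mix -> 100 everywhere
```

Applying this per frame to two time-aligned recordings of the same boulder problem would produce the ghosted overlay described above; the hard part of the project is the alignment, not the blend.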

Further reading

The project will use computer vision tools such as OpenCV or the Google Vision API. No prior experience with these tools is required.



JPB-UG-2a: Sound event detection for smart cars (CS/AI/Maths)


JPB-UG-2b: Sound event detection for smart cars (CS/AI/Maths)

Description

Sound event detection is the task of trying to detect the presence of a certain sound (e.g. a door slamming) in a continuous audio recording. This problem is a key component of `scene understanding'. Solutions have many potential applications, e.g. search and retrieval of audio recordings, machine listening for autonomous robots, and audio-based monitoring and surveillance systems. This project will build and evaluate novel solutions to this problem. We will adopt the evaluation framework provided by Task 4 of the DCASE2017 machine listening challenge, which focuses on sound events relevant to the design of smart cars.
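To make the task concrete, here is a minimal energy-threshold detector on a synthetic signal. Real DCASE systems work on spectral features with learned classifiers rather than raw-sample energy; the function name, frame length, and threshold below are invented for illustration only.

```python
import math

def detect_events(samples, frame_len=4, threshold=0.1):
    """Return (start, end) sample indices of non-overlapping frames
    whose RMS energy exceeds `threshold` -- a toy stand-in for a
    real sound event detector."""
    events = []
    for start in range(0, len(samples) - frame_len + 1, frame_len):
        frame = samples[start:start + frame_len]
        rms = math.sqrt(sum(s * s for s in frame) / frame_len)
        if rms > threshold:
            events.append((start, start + frame_len))
    return events

# Silence, then a short loud burst, then silence again.
signal = [0.0] * 8 + [0.5, -0.5, 0.5, -0.5] + [0.0] * 8
print(detect_events(signal))  # the burst occupies samples 8-12
```

The challenge version of the problem replaces the fixed threshold with a model trained to recognise specific event classes (car horn, siren, etc.) and evaluates how accurately the predicted (start, end) intervals match human annotations.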

Further reading

For further details see here.

Prerequisites

  • An interest in AI / machine listening
  • Python programming skills


JPB-UG-4: Web-based tools for browsing a multichannel, conversational speech database

Description

We are currently planning to record a 100-hour database of four-person dinner party conversations. This database will be recorded with multiple microphones: microphones worn by the participants, and microphone arrays that record the scene from a distance. The speech in the database will be fully transcribed, with the start and end time of each utterance provided. We plan to use this data for the next in a series of speech recognition evaluations known as the CHiME challenges.

This project will design on-line tools that let users browse this database interactively. For example, users will be able to scroll through the speech annotations and select utterances to play back. They will be able to choose which microphone to listen to. Ideally, they will also be able to edit the transcriptions and fix errors.
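The core data model behind such a browser is simple: a list of time-stamped, speaker-labelled utterances, queried by time window as the user scrolls. The sketch below is in Python for illustration (the actual tool would run in JavaScript in the browser), and the record fields and function names are assumptions, not a specification of the CHiME data format.

```python
from collections import namedtuple

# A transcribed utterance: speaker label, start/end time (seconds), text.
Utterance = namedtuple("Utterance", ["speaker", "start", "end", "text"])

def utterances_in_window(utterances, win_start, win_end):
    """Return the utterances that overlap the time window [win_start, win_end]."""
    return [u for u in utterances
            if u.start < win_end and u.end > win_start]

transcript = [
    Utterance("P1", 0.0, 2.5, "Could you pass the salt?"),
    Utterance("P2", 2.0, 4.0, "Sure, here you go."),
    Utterance("P3", 6.0, 7.5, "How was your week?"),
]

# Scrolling to the window 1.0-3.0 s should show the first two utterances.
visible = utterances_in_window(transcript, 1.0, 3.0)
```

Playback would then map each selected utterance's (start, end) times onto whichever microphone channel the user has chosen.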

The system will need to be designed to operate inside a web browser, e.g. using a JavaScript library such as React.

Further reading

For further details see here.

Prerequisites

  • Client-side web programming skills
  • An interest in audio signal processing


JPB-UG-5: Using simulated environments to learn real world navigation (CS/AI/Maths)


JPB-UG-7: Data-driven project with Emotiv BCI Headset (CS/AI/Maths)

Description

The Department has purchased ten Emotiv EPOC+ `neuroheadsets'. These are state-of-the-art wireless EEG headsets that can be used for developing advanced brain-computer interface (BCI) applications. The Emotiv website provides a lot of information about the capabilities of the headset and describes some example applications. It also provides some free demo software. I am looking for a student with a novel idea that would apply techniques learnt in COM2004 to Emotiv data in an interesting way.
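By way of illustration, one kind of COM2004-style technique that could be applied to headset data is a simple classifier over feature vectors extracted from the EEG signal. Everything below is invented for the sketch: the two-dimensional features, the class labels, and the nearest-centroid method are assumptions, not a description of the Emotiv software.

```python
def centroid(vectors):
    """Mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(sample, centroids):
    """Assign `sample` to the label of the nearest class centroid
    (squared Euclidean distance)."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sq_dist(sample, centroids[label]))

# Hypothetical 2-D features (e.g. band power in two EEG channels)
# recorded during two imagined mental states.
train = {
    "relaxed": [[1.0, 0.9], [0.8, 1.1], [1.1, 1.0]],
    "focused": [[3.0, 3.2], [2.9, 2.8], [3.1, 3.0]],
}
centroids = {label: centroid(vs) for label, vs in train.items()}

print(classify([2.8, 3.1], centroids))  # lands near the "focused" cluster
```

A real project would replace the toy features with features computed from recorded Emotiv data and would evaluate the classifier properly on held-out recordings.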

If you have an idea please email me so that we can meet to discuss.

Further reading

For further details see here.

Prerequisites

  • An interest in Brain Computer Interfaces.
  • Python programming skills


JPB-UG-9: Learning to drive in OpenAI Universe (CS/AI/Maths)

Description

Self-proposed project.

Further reading

For further details see here.

  • xxx