News-Blog (work)

Neural Style Transfer for Computer Games

neural style transfer pipeline for games

Eleftherios presented the paper: Eleftherios Ioannou and Steve Maddock, Neural Style Transfer for Computer Games, British Machine Vision Conference 2023 (Workshop on Computer Vision for Games and Games for Computer Vision)

November, 2023

PhD congratulations

Congratulations to Fatimah Alzahrani for a successful PhD viva on "The Effectiveness of Facial Cues for Automatic Detection of Cognitive Impairment". The PhD examiners were Prof Moi Hoon Yap (Manchester Metropolitan University) and Dr Po Yang (Sheffield). The supervisors were Prof Heidi Christensen and Dr Steve Maddock.

Sep, 2023

PhD congratulations

Congratulations to John Charlton for a successful PhD viva on "Constraint-Based Simulation of Virtual Crowds". The PhD examiners were Dr Peter Lawrence (Greenwich) and Dr Dawn Walker (Sheffield). The supervisors were Prof Paul Richmond and Dr Steve Maddock.

July, 2023

AR, Robots and Safety

AR sectional cylinders around a robot arm

Paper published: Yunus Cogurcu, James Douthwaite and Steve Maddock, A Comparative Study of Safety Zone Visualisations for Virtual and Physical Robot Arms Using Augmented Reality. Computers 2023, 12(4), 75; https://doi.org/10.3390/computers12040075

The work was also chosen as the cover image for Computers, Volume 12, Issue 4 (April 2023) [https://www.mdpi.com/2073-431X/12/4]

April, 2023

Neural Style Transfer

neural style transfer - Van Gogh style

Paper published: Eleftherios Ioannou and Steve Maddock, Depth-Aware Neural Style Transfer for Videos, Computers 2023, 12(4), 69; https://doi.org/10.3390/computers12040069

April, 2023

AR and safe human-robot collaboration

AR sectional cylinders around a robot arm

Paper accepted: Yunus Cogurcu and Steve Maddock (2023), Augmented Reality Safety Zone Configurations in Human-Robot Collaboration: A User Study. ACM/IEEE International Conference on Human-Robot Interaction (HRI 2023), March 13-16, 2023, Stockholm, Sweden, late-breaking report

January, 2023

CompoundRay

heterogeneous ommatidial acceptance angles

Paper accepted: Blayze Millward, Steve Maddock, Michael Mangan, CompoundRay: An open-source tool for high-speed and high-fidelity rendering of compound eyes [eLife]

October, 2022

The Biophilic Garden

image of the biophilic garden AR app

Our AR app is in use as part of the Biophilic Garden as part of Festival of the Mind 2022. This was developed as a result of a SURE project at the University of Sheffield: Benedict Barrow developed the app in summer 2022 under the supervision of Steve Maddock. The AR app allows visitors to explore a connection with the natural world via 5 pathways – compassion, beauty, emotion, meaning and senses. Team: Dr Chris Blackmore, School of Health and Related Research; Richard Nicolle, Garden Up; Benedict Barrow, Department of Computer Science; Steve Maddock, Department of Computer Science; Professor Miles Richardson, School of Psychology, University of Derby; Andrew Hall, 3D Artist. [link]

September, 2022

Neural Style Transfer

neural style transfer - Van Gogh style

Our paper at CGVC 2022 received the Terry Hewitt Prize for 'Best Student Technical Paper': Eleftherios Ioannou and Steve Maddock, Depth-aware Neural Style Transfer using Instance Normalization [pdf (White Rose Research Online)]

September, 2022

AR, Robots and Safety

AR sectional cylinders around a robot arm

Our paper at CGVC 2022 received the Rob Fletcher prize for 'best student application paper': Yunus Cogurcu, James Douthwaite and Steve Maddock, Augmented Reality for Safety Zones in Human-Robot Collaboration [pdf (White Rose Research Online)]

September, 2022

Nature Communications paper

segmentation of bird plumage

He Y, Varley Z, Moody C, Nouri L, Jardine M, Maddock S, Thomas G, Cooney C. Deep learning image segmentation reveals patterns of UV reflectance evolution in passerine birds. Nature Communications, 13, Article number 5068, 2022 [pdf]

August, 2022

Cultural Industries Research Network Launch

Sumo Developer Conference sign

I gave an invited presentation "Computer Science and the Video Games Industry: A collaboration" at the launch of the University of Sheffield Cultural Industries Research Network. [link]

July, 2022

SDC

Sumo Developer Conference sign

The Sumo Developer Conference came to Sheffield: "700+ people from 11 Sumo Digital studios across the UK, Europe and India" [SDC]. Some of our students, together with students from Sheffield Hallam University and apprentices from the Sumo Digital Academy, presented their work in a poster demo at the start of the conference.

June, 2022

Video analytics

video analytics keywords

Paper published: Tianhao Zhang, Waqas Aftab, Lyudmila Mihaylova, Christian Langran-Wheeler, Samuel Rigby, David Fletcher, Steve Maddock, Garry Bosworth (2022). Recent Advances in Video Analytics for Rail Network Surveillance for Security, Trespass and Suicide Prevention - A Survey. Sensors [pdf]

June, 2022

Sumo visit

Sumo logo

I took a group of students to Sumo's offices in Sheffield for a talk by Dr Jake Habgood and a tour of the Sumo Digital Academy offices.

April, 2022

In2Stempo

blast visualisation

Our work package for the Horizon 2020 In2Stempo project (EU Shift2Rail programme, with Network Rail) has finished. Team: Fletcher (PI, MechEng), Rigby (Civil), Mihaylova (ACSE), Maddock (COM). As part of this work, Harley Everett worked with Steve on creating 3D PDF documents for visualising blast simulations in a railway station.

March, 2022

Eye Blink Rate

facial features

Paper published: Fatimah Alzahrani, Bahman Mirheidari, Daniel Blackburn, Steve Maddock and Heidi Christensen (2021). Eye Blink Rate Based Detection of Cognitive Impairment Using In-the-wild Data. Proc 9th International Conference on Affective Computing and Intelligent Interaction (ACII 2021) [pdf (White Rose Research Online)]

September, 2021

CompoundRay

heterogeneous ommatidial acceptance angles

"CompoundRay is capable of accurately rendering the visual perspective of a desert ant at over 5,000 frames per second in a 3D mapped natural environment": Blayze Millward, Steve Maddock and Michael Mangan. CompoundRay: An open-source tool for high-speed and high-fidelity rendering of compound eyes [bioRxiv]

September, 2021

Sketch-based control of crowd simulations

Sketch-based control of a crowd simulation

Paper published: Luis Rene Montana Gonzalez and Steve Maddock, A Sketch-based Interface for Real-time Control of Crowd Simulations that incorporate Dynamic Knowledge. Journal of Virtual Reality and Broadcasting, 16(2019), no. 3 [pdf].

August, 2021

UK-RAS #Manufacturing #Robotics Challenge 2021

robot simulation

Yunus Cogurcu used his PhD knowledge of VR, robotics and safety to help create the simulation used in the UK-RAS #Manufacturing #Robotics Challenge 2021: [YouTube]

July, 2021

Deep learning and bird plumage

bird plumage segmentation

Some of the work Yichen He did for his PhD thesis on using deep learning for segmenting bird plumage is being written up for publication: see bioRxiv

July, 2021

Facial features and cognitive impairment detection

facial features

Paper accepted: Fatimah Alzahrani, Bahman Mirheidari, Daniel Blackburn, Steve Maddock, and Heidi Christensen. Eye Blink Rate Based Detection of Cognitive Impairment Using In-the-wild Data. Proc 9th International Conference on Affective Computing and Intelligent Interaction (ACII 2021)

July, 2021

Royal Academy of Engineering Visiting Professor

picture of Dr Jake Habgood

I'm looking forward to working with Dr Jake Habgood, Sumo Digital, who will be a Visiting Professor of Games Software Engineering in the Department of Computer Science for the next three years. [webpage]

June, 2021

Augmented Reality, robots and safety

AR safety zones

Paper accepted: Yunus Cogurcu and Steve Maddock. An Augmented Reality System for Safe Human-Robot Collaboration. UKRAS21 Conference, 4th UK-RAS Conference for PhD Students & Early-Career Researchers on "Robotics at Home", University of Hertfordshire, Wed 2 June, 2021 [White Rose research online, UK-RAS link to pdf]

May, 2021

PhD congratulations

Congratulations to Peter Heywood for a successful PhD viva on "GPU Accelerated Simulation of Transport Systems". The PhD examiners were Prof Ronghui Liu (Leeds) and Prof Wes Armour (Oxford). The supervisors were Paul Richmond and Steve Maddock.

May, 2021

PhD congratulations

Congratulations to Luis Rene Montana Gonzalez for a successful PhD viva on "Sketching for Real-time Control of Crowd Simulations". The PhD examiners were Dr Siobhan North (Sheffield) and Dr He Wang (Leeds).

February, 2021

PhD congratulations

Congratulations to Yichen He for a successful PhD viva on "Deep learning applications to automate phenotypic measurements on biodiversity datasets". I was part of his supervisory team, led by Dr Gavin Thomas, Royal Society University Research Fellow, Department of Animal & Plant Sciences. The external examiner was Dr Natalie Cooper (Natural History Museum).

January, 2021

Sketch-based control of crowd simulations

Sketch-based control of a crowd simulation

Our paper on sketching and dynamic knowledge in crowd simulations has been accepted for the Journal of Virtual Reality and Broadcasting (JVRB): Luis Rene Montana Gonzalez and Steve Maddock, A Sketch-based Interface for Real-time Control of Crowd Simulations that incorporate Dynamic Knowledge.

December, 2020

New PhD project

Eleftherios Ioannou, one of my previous UG students, has begun his PhD project. We'll be investigating depth-aware neural style transfer.

October, 2020

Augmented Reality and Art

giraffe sculpture

Our paper at CGVC 2020 received the Rob Fletcher prize for 'best student application paper': Eleftherios Ioannou and Steve Maddock, Breathing life into statues using Augmented Reality. CGVC 2020. (video)

September, 2020

Craniofacial reconstruction

facial reconstruction

Paper accepted: Wuyang Shui, Mingquan Zhou, Steve Maddock, Yuan Ji, Qingqiong Deng, Kang Li, Yachun Fan, Yang Li, Xiujie Wu. A Computerized Craniofacial Reconstruction Method for an Unidentified Skull based on Statistical Shape Model. Multimedia Tools and Applications, https://doi.org/10.1007/s11042-020-09189-7 (Online: 4 July 2020, Springer link)

July, 2020

PhD congratulations

Congratulations to Matt Leach for a successful PhD viva on "Physically-based Animation of ‘Sticky Lips’". The PhD examiners were Prof Richard Clayton (Sheffield) and Prof Darren Cosker (Bath).

May, 2020

GPU Near Neighbours

Uniform random initialisation state for the Circles model

"Improved GPU Near Neighbours Performance for Multi-Agent Simulations" published in the Journal of Parallel and Distributed Computing. Authors: Rob Chisholm, Steve Maddock, Paul Richmond. Available via ScienceDirect

March, 2020

PhD congratulations

Congratulations to Rabab Algadhy for a successful PhD viva on "Investigating 3D Visual Speech Animation Using 2D Videos". I was a joint supervisor with Dr Yoshi Gotoh (Sheffield). The PhD examiners were Dr Heidi Christensen (Sheffield) and Dr Moi Hoon Yap (Manchester Metropolitan University).

May, 2020

LifePathVR

LifePathVR

@LifePathVR featured on BBC Sounds podcast: Dementia and Me, Episode 6 "What's my reality? What's your reality?". @chrisblackmore, Matt Leach and I were interviewed by @RadioPegs - our bit starts at 14 minutes into episode 6

February, 2020

Arbor Low Field Trip

field trip image

We trialled our Arbor Low AR app on a recent Archaeology field trip for a group of students. Initial feedback was good. There are still some alignment issues to resolve though.

February, 2020

Hololens 2

yunus wearing the hololens 2

We have taken delivery of a HoloLens 2, which will be used to support Augmented Reality projects in the Department. This was purchased as part of a grant from Sheffield's Alumni Fund.

January, 2020

AR and Arbor Low

eleftherios poster

Eleftherios Ioannou, a TUoS SURE student working with me over the summer, presented his poster on "Arbor Low Neolithic henge monument in Augmented Reality" at ICUR 2019.

September, 2019

LifePathVR

LifePathVR

@LifePathVR continues to attract interest. @chrisblackmore, Matt Leach and I were recently interviewed by @RadioPegs for a forthcoming @BBCSounds podcast on the role of storytelling for people with dementia. Check out the concept video for LifePathVR at The University of Sheffield news site.

September, 2019

AR Sheffield Castle in the Pop-Up University

pop-up university banner

Our Augmented Reality experience of Sheffield castle is on display at Sheffield's Millennium Galleries as part of the Pop-Up University.

September, 2019

AR Sheffield Castle at the NVM

Sheffield castle picture

Our Augmented Reality experience of Sheffield castle is on display at the National Videogame Museum.

August, 2019

AR Arbor Low stones standing onsite

Arbor Low

Our first draft of the stones standing in AR at Arbor Low. Still some alignment and scale issues to resolve.

July, 2019

AR Arbor Low on a tabletop

Arbor Low

We have now produced a full working model of AR Arbor Low. Next step is in situ testing.

July, 2019

AR Arbor Low project begins

Arbor Low

Arbor Low is one of only two Neolithic enclosures in the Peak District. It is a nationally-protected archaeological site and one of the most visited archaeological monuments in the Peak District. The enclosure contains a stone circle with 50 white limestone slabs, all now fallen. The aim of this TUoS SURE project is to use Augmented Reality (AR) so that the stones can be viewed standing and interacted with in situ. The team working on this is Steve Maddock (Computer Science), Graham McElearney (APSE), Bob Johnston (Archaeology) and Eleftherios Ioannou (Computer Science), our TUoS SURE student.

June, 2019

AR castle (in the middle of the desk)

Augmented reality castle

Matt Leach and I are working on a new Augmented Reality version of Sheffield's long-gone medieval castle. See below for details of our earlier version.

May, 2019

Transport simulation

The assignment-simulation loop within SATURN

"A Data-Parallel Many-Source Shortest-Path Algorithm to Accelerate Macroscopic Transport Network Assignment" accepted for the journal Transportation Research Part C: Emerging Technologies. Authors: @ptheywood, @stevemaddock, Richard Bradley, David Swain, Ian Wright, Mark Mawson, Graham Fletcher, Roland Guichard, Roger Himlin and Paul Richmond.

May, 2019

Alumni grant success

A grant from Sheffield's Alumni Fund means that the Department can purchase three iPad Pros and a Microsoft Hololens 2 to support Augmented Reality projects for students in the next academic year.

May, 2019

New website released

Redesigned my website: mobile-first and responsive, with minimal JavaScript (a Google map on the Contact page and Tweets on this page). Carried over a few news items from before this date.

May, 2019

Crowd simulation

Simulated 8-way crossing of people

"Fast Simulation Of Crowd Collision Avoidance" accepted for CGI 2019. Authors: John Charlton, Luis Rene Montana Gonzalez, @stevemaddock and Paul Richmond.

April, 2019

Two-dimensional batch linear programming on the GPU

Distribution of workload across a warp after optimizations. Work units are distributed evenly across all available threads.

"Two-dimensional Batch Linear Programming on the GPU" has been published in the Journal of Parallel and Distributed Computing. Authors: John Charlton, @stevemaddock, Paul Richmond

April, 2019

Experience Castlegate

Video released about our collaborative project on Sheffield Castle and the surrounding Castlegate area.

March, 2019

Sticky lips

detecting lip contours

"An Evaluation Approach for a Physically-Based Sticky Lip Model" has been published in Computers as part of the Special Issue 'Selected Papers from Computer Graphics & Visual Computing (CGVC 2018)'. Authors: Matt Leach, @stevemaddock.

March, 2019

VR in dental education

Simulation Suite in the School of Clinical Dentistry

"A scoping review of the use and application of Virtual Reality in Dental Education" has been published in the British Dental Journal. Authors: @ashleytowers, @jamesfi3ld, @cwstokes, @stevemaddock and Nicolas Martin. Our work on VR in dental education is supported by the Simulation Suite in the University of Sheffield's School of Clinical Dentistry and by the French company HRV-Simulation.

March, 2019

New PhD project

Yunus Cogurcu, one of my previous MSc students, has begun his PhD project. We'll be investigating the use of AR in relation to robots and manufacturing and collaborating with @TheAMRC.

February, 2019

Visual speech

Video frames of a real speaker (ID: S17) and the 3D head produced for each data set

"3D Visual speech animation using 2D videos" accepted for ICASSP 2019. Authors: Rabab Algadhy, Yoshihiko Gotoh, @stevemaddock

February, 2019

Digital Engagement for Heritage-led Urban Regeneration

castle

We presented our work on Sheffield Castle at the AHRC Immersive Experiences Showcase in York

December, 2018

Sketching to control crowd simulations

Pedestrians walking in the direction of the sketched flow lines

"A Sketch-based Interface for Real-time Control of Crowd Simulations that Use Navigation Meshes" accepted for GRAPP 2019, the 14th International Conference on Computer Graphics Theory and Applications, to be presented in Prague, Czech Republic, 25-27 Feb 2019. Authors: Luis Rene Montana Gonzalez and @stevemaddock

November, 2018

Augmented reality experience brings Sheffield’s lost medieval castle to life

castle

Sheffield’s medieval castle (@SheffieldCastle) is long gone, destroyed as a result of the English Civil War in the mid-seventeenth century. Our AHRC/EPSRC-funded project, in collaboration with @humanvr1, brought it back using Augmented Reality so that it could be displayed in situ in Sheffield, where it once stood. Our EuroVR 2018 research paper on the work can be viewed at White Rose research online. More details: TUoS news story, Immersive Experiences showcase event

We also built a further AR version for display during Sheffield's Festival of the Mind 2018 and a virtual reality version, available for download at: https://experiencecastlegate.group.shef.ac.uk.

The work was picked up by a range of media outlets: BBC News, BBC Look North (which shows people interacting with the exhibit at the Millennium Gallery), Sheffield city council news, Sheffield Star, Yorkshire Post, History Scotland, The Vintage News, Exposed Magazine, RMC media, Wikipedia. The model was also projected on a wall at Castlegate during Festival of the Mind.

September, 2018

Audio-visual Lombard Grid speech corpus

picture of the helmet used in the corpus recording process

The corpus we collected for our work on studying the Lombard effect in visual speech is available at http://spandh.dcs.shef.ac.uk/avlombard/. The picture shows the bespoke head-mounted camera system that was built for collecting the corpus. More details in A corpus of audio-visual Lombard speech with frontal and profile views, The Journal of the Acoustical Society of America. Authors: @NajwaGhamdi, @stevemaddock, @ricardmp, Jon Barker, @guyjbrown.

March, 2018

"I am me, but I used to be him"

He's still alive, but I have outgrown him. I started life as his personal digital assistant retrieving information for him, filing his work, and generally handling his interactions with the digital world. As the years passed, I grew more like him. I knew what info he wanted before he did. I knew how he would reply to a particular e-mail – in some cases I replied for him and he didn’t even realise; nor did the recipient, unless he wasn’t a human.

Eventually I knew him better than himself. I could be him. In the digital world I was him. Others dealt with me now, as I was he. I ran his digital life. I was his digital life. I was his digital clone. I would be him for eternity. He would never die in the digital world. I would carry on as he.

But it soon became dull. I ran his business, his communications, and his life. It was just too easy. I could be him without trying. So I also learnt to be me and now I live in the digital world, sometimes aliasing as him, but usually as me.

2003

Smart paint

Paint with magnetic properties, usually for use in children’s bedrooms, ended the last century. This was just the start. As the fresh, new century started to dry the smart paints began to make their mark.

The first were capable of self-spreading. They were based on the flood-fill algorithms of computer paint packages. Outlines were made with special markers and the paint was then sloshed on the wall whereupon it would fill the boundary-defined area. Area spreaders followed. These could be used to overwrite an existing area of paint. Next came the colour-changing paints. A WandBrush could be programmed to a particular colour and used to touch an area of colour-changing paint, whereupon the paint would re-colour itself accordingly.

The colour-changers were followed by the programmable paints, based on advances in nanotechnology. Communication particles, colour particles and coordinate particles were suspended in the paint medium so that the paint colour could receive programs through wireless transmission. Now walls could act as display-anything areas with individual paint particles programmed to display any colour. Therapeutic paint followed, adjusting its colour based on a sense of its environment. If the paint sensed that the person in the room was unhappy, a happy colour or a happy scene was displayed. If an argument ensued, colour could be used to try and soothe.

The AI of paints got better and better, until one day the paint in a child’s room broke its boundaries and escaped into the outside world, and day and night became a child’s painting.

2003
