Yuxin Cui, Zijian Huang, Siyuan Peng, Mingrui Tang and Ye Zhao completed their dissertation projects with me. Yuxin's work on "Particle System in AR Game" produced a smartphone AR game that uses particle systems to handle shooting and explosions in combat with a monster that spawns at a location in the real world. Zijian's work on "Constructing a life path" investigated how to construct a series of paths on a 2D plane, each decorated with materials and surrounding environments, which could then be walked along in 3D using a VR headset. Siyuan's work on "Making cat faces" developed a smartphone app that animated a cat's face by mimicking the movement of a human face captured using the smartphone's camera - a particular aspect of this was how to control the cat's ears and whiskers. Mingrui's work on "Variation on a facial expression" investigated how to visualise the difference between facial expressions. Ye's work on "A VR game using Particle System" created a VR game that uses particle systems to model different types of bioluminescence.
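As an aside for readers unfamiliar with the technique, the Python sketch below illustrates the kind of particle update loop that underpins effects such as explosions. It is a minimal sketch with made-up names (Particle, emit_burst, update), not the students' actual game code, which would typically sit inside a game engine.

import random

# A hypothetical, minimal particle system for burst effects such as
# explosions. Illustrative only - not code from the student projects.
class Particle:
    def __init__(self, pos, vel, life):
        self.pos = list(pos)   # x, y, z position
        self.vel = list(vel)   # velocity in units per second
        self.life = life       # remaining lifetime in seconds

def emit_burst(origin, count=100, speed=5.0, life=1.5):
    """Spawn particles flying in random directions from one point."""
    particles = []
    for _ in range(count):
        d = [random.uniform(-1.0, 1.0) for _ in range(3)]
        length = max(1e-6, sum(c * c for c in d) ** 0.5)
        particles.append(Particle(origin, [speed * c / length for c in d], life))
    return particles

def update(particles, dt, gravity=-9.8):
    """Integrate velocity and position, then cull expired particles."""
    for p in particles:
        p.vel[1] += gravity * dt                       # gravity acts on y
        p.pos = [x + v * dt for x, v in zip(p.pos, p.vel)]
        p.life -= dt
    return [p for p in particles if p.life > 0.0]

# Example: an explosion at the monster's spawn point, stepped at 60 fps.
ps = emit_burst((0.0, 1.0, 0.0))
for _ in range(90):
    ps = update(ps, 1.0 / 60.0)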
September, 2024
Visiting Professor: three more years
Dr Jake Habgood, Director of Education Partnerships for the Sumo Digital Group, will be a Visiting Professor of Games
Software Engineering in the School of Computer Science from 2024 to 2027 [webpage]. This follows on from a successful three-year Royal Academy of Engineering Visiting Professor post that finished recently. We will continue our work on the use of web-based portfolios for job applications in the games industry - see the example portfolio hosted by Sumo Academy, which we developed during the RAE VP post. A Google Site guide to this portfolio and to using portfolios will go live at the start of the next academic year. In addition, over the next three years Jake will work with our students on games-related activities in our degree programmes and advise students developing games in extracurricular activities.
August, 2024
Neural Style Transfer
Paper published: Eleftherios Ioannou, Steve Maddock (2024), Evaluation in Neural Style Transfer: A Review,
Computer Graphics Forum [Wiley open access]
Jameel Chagpar, Chengyu Fu, Aden Manson and George Smith completed their undergraduate dissertation projects with me. Jameel's work on "Exaggerating videos of facial animation" used the 'Audio-visual Lombard Grid speech corpus' to investigate how mouth movement could be exaggerated in videos, with a potential application in helping second language learners more easily understand visual speech in videos. Chengyu's work on "An Investigation of Normal Maps and Level-of-detail for Polygon Meshes" explored how to use image quality metrics (SSIM, PSNR, MSE, LPIPS) to judge the quality of rendering results when a high-detail polygon mesh is decimated and rendered using normal maps. Aden developed a "VR Volcanology Game" using Meta Quest 2 and 3 for use by Dr Tom Pering in teaching Geography students. George's work on "Curating an art exhibition using a HoloLens 2" involved the development of a HoloLens 2 GUI for arranging virtual paintings on the walls of a room.
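As an illustration for readers unfamiliar with these metrics, the short Python sketch below shows how the first three can be computed with scikit-image. The file names are placeholders rather than files from Chengyu's project, and LPIPS, being a learned metric, needs a separate package (for example the PyTorch lpips library).

from skimage import io
from skimage.metrics import (mean_squared_error,
                             peak_signal_noise_ratio,
                             structural_similarity)

# Placeholder file names: renders of the same view, one from the
# high-detail mesh and one from the decimated, normal-mapped mesh.
reference = io.imread("render_full_detail.png")
candidate = io.imread("render_decimated_normal_mapped.png")

# Full-reference metrics: lower MSE and higher PSNR/SSIM indicate that
# the normal-mapped low-detail render is closer to the reference.
mse = mean_squared_error(reference, candidate)
psnr = peak_signal_noise_ratio(reference, candidate)
ssim = structural_similarity(reference, candidate, channel_axis=-1)

print(f"MSE: {mse:.2f}  PSNR: {psnr:.2f} dB  SSIM: {ssim:.4f}")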
May, 2024
Neural Style Transfer for Computer Games
Paper published: Eleftherios Ioannou, Steve Maddock (2024), Towards Real-time G-buffer-Guided Style Transfer in Computer Games, IEEE Transactions on Games [IEEE Xplore]
March, 2024
Liquid-Fabric Interaction
Rob presented the paper: Rob Dennison and Steve Maddock (2024). Using The Polynomial Particle-in-Cell Method for Liquid-Fabric Interaction. In Proceedings of the 19th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 1: GRAPP, pages 244-251 (Rome, 27-29 Feb, 2024) [pdf]
February, 2024
Grant success
'Unlocking the complexity of organismal colour patterns using artificial intelligence'; Investigators: Cooney (PI, Biosciences), Thomas (Biosciences), Maddock (Computer Science), Han (Computer Science); (£287,290). Research partners: University of Sheffield, UK and University of Montpellier, France
November, 2023
PhD congratulations
Congratulations to Yunus Cogurcu for a successful PhD viva on "Augmented Reality for Safe Human-Robot Collaboration". The PhD examiners were Prof John Ahmet Erkoyuncu (Cranfield University) and Dr Po Yang (Sheffield). Supervisor: Dr Steve Maddock. In collaboration with Sheffield Robotics and Jonathan Eyre, AMRC.
Congratulations to Feixiang He at the University of Leeds. I was the external examiner for his PhD on 'Advancing High-Fidelity Crowd Simulation: From Behavior to Environment Layout'. The internal examiner was Dr Nick Malleson and the supervisors were Prof He Wang and Prof David Hogg.
September, 2023
PhD congratulations
Congratulations to Fatimah Alzahrani for a successful PhD viva on "The Effectiveness of Facial Cues for
Automatic Detection of Cognitive Impairment". The PhD examiners were Prof Moi Hoon Yap (Manchester Metropolitan
University) and Dr Po Yang (Sheffield). The supervisors were Prof Heidi Christensen and Dr Steve Maddock.
September, 2023
PhD congratulations
Congratulations to John Charlton for a successful PhD viva on "Constraint-Based Simulation of Virtual
Crowds". The PhD examiners were Dr Peter Lawrence (Greenwich) and Dr Dawn Walker (Sheffield). The supervisors
were
Prof Paul Richmond and Dr Steve Maddock.
July, 2023
SDC 2023
The Sumo Developer Conference came to Sheffield: over 1000 people from 11 Sumo Digital studios across the UK,
Europe and India. Some of our students, together with students from Hallam University and apprentices from the Sumo Digital Academy, presented their work in a demo session at the start of the conference.
June, 2023
AR, Robots and Safety
Paper published: Yunus Cogurcu, James Douthwaite and Steve Maddock, A Comparative Study of Safety Zone
Visualisations for Virtual and Physical Robot Arms Using Augmented Reality, Computers, 2023, 12(4), 75; https://doi.org/10.3390/computers12040075
Paper published: Eleftherios Ioannou and Steve Maddock, Depth-Aware Neural Style Transfer for Videos,
Computers 2023, 12(4), 69; https://doi.org/10.3390/computers12040069
April, 2023
AR and safe human-robot collaboration
Yunus presented the paper: Yunus Cogurcu and Steve Maddock (2023), Augmented Reality Safety Zone Configurations in
Human-Robot Collaboration: A User Study. ACM/IEEE International Conference on Human-Robot Interaction (HRI
2023), March 13-16, 2023, Stockholm, Sweden, late-breaking report
March, 2023
CompoundRay
Paper accepted: Blayze Millward, Steve Maddock, Michael Mangan, CompoundRay, an open-source tool for
high-speed and high-fidelity rendering of compound eyes [eLife]
October, 2022
The Biophilic Garden
Our AR app is in use in the Biophilic Garden as part of Festival of the Mind 2022. The app grew out of a SURE project at the University of Sheffield: Benedict Barrow developed it in summer 2022 under the supervision of Steve Maddock. It allows visitors to explore a connection with the natural world via five pathways: compassion, beauty, emotion, meaning and senses. Team: Dr Chris Blackmore, School of
Health and Related Research; Richard Nicolle, Garden Up; Benedict Barrow, Department of Computer Science;
Steve Maddock, Department of Computer Science; Professor Miles Richardson, School of Psychology, University of
Derby; Andrew Hall, 3D Artist. [link]
September, 2022
Neural Style Transfer
Our paper at CGVC 2022 received the Terry Hewitt Prize for 'Best Student Technical Paper': Eleftherios
Ioannou and Steve Maddock, Depth-aware Neural Style Transfer using Instance Normalization [pdf (White Rose Research Online)]
September, 2022
AR, Robots and Safety
Our paper at CGVC 2022 received the Rob Fletcher Prize for 'Best Student Application Paper': Yunus Cogurcu, James Douthwaite and Steve Maddock, Augmented Reality for Safety Zones in Human-Robot Collaboration [pdf (White Rose Research Online)]
September, 2022
Nature Communications paper
He Y, Varley Z, Moody C, Nouri L, Jardine M, Maddock S, Thomas G, Cooney C. Deep learning image segmentation
reveals patterns of UV reflectance evolution in passerine birds. Nature Communications, 13, Article number
5068, 2022 [pdf]
August, 2022
Cultural Industries Research Network Launch
I gave an invited presentation "Computer Science and the Video Games Industry: A collaboration" at the launch
of the University of Sheffield Cultural Industries Research Network. [link]
July, 2022
SDC 2022
The Sumo Developer Conference came to Sheffield: "700+ people from 11 Sumo Digital studios across the UK,
Europe and India" [SDC]. Some of our students, together with
students from Hallam University and apprentices from Sumo Digital
Academy presented their work in a poster demo at the start of the conference.
June, 2022
Video analytics
Paper published: Tianhao Zhang, Waqas Aftab, Lyudmila Mihaylova, Christian Langran-Wheeler, Samuel Rigby,
David Fletcher, Steve Maddock, Garry Bosworth (2022). Recent Advances in Video Analytics for Rail Network
Surveillance for Security, Trespass and Suicide Prevention - A Survey. Sensors [pdf]
June, 2022
Sumo visit
I took a group of students to Sumo's offices in Sheffield
for a talk by Dr Jake Habgood and a tour of the Sumo Digital Academy offices.
April, 2022
In2Stempo
Our work package for In2Stempo, a Horizon 2020 project with Network Rail under the EU Shift2Rail programme, has finished. Team: Fletcher (PI, MechEng), Rigby (Civil), Mihaylova (ACSE), Maddock (COM). As part of this work
Harley Everett worked with Steve on creating 3D pdf documents for visualisation of blast simulations in a
railway station.
March, 2022
Eye Blink Rate
Paper published: Fatimah Alzahrani, Bahman Mirheidari, Daniel Blackburn, Steve Maddock and Heidi Christensen
(2021). Eye Blink Rate Based Detection of Cognitive Impairment Using In-the-wild Data. Proc 9th International
Conference on Affective Computing and Intelligent Interaction (ACII 2021) [pdf (White Rose Research Online)]
September, 2021
CompoundRay
"CompoundRay is capable of accurately rendering the visual perspective of a desert ant at over 5,000 frames
per second in a 3D mapped natural environment": Blayze Millward, Steve Maddock and Michael Mangan. CompoundRay:
An open-source tool for high-speed and high-fidelity rendering of compound eyes [bioRxiv]
September, 2021
Sketch-based control of crowd simulations
Paper published: Luis Rene Montana Gonzalez and Steve Maddock, A Sketch-based Interface for Real-time Control
of Crowd Simulations that incorporate Dynamic Knowledge. Journal of Virtual Reality and Broadcasting,
16(2019), no. 3. [pdf].
August, 2021
UK-RAS #Manufacturing #Robotics Challenge 2021
Yunus Cogurcu used his PhD knowledge of VR, robotics and safety to help create the simulation used in the
UK-RAS #Manufacturing #Robotics Challenge 2021: [YouTube]
July, 2021
Deep learning and bird plumage
Some of the work Yichen He did for his PhD thesis on using deep learning for segmenting bird plumage is being
written up for publication: see bioRxiv
July, 2021
Facial features and cognitive impairment detection
Paper accepted: Fatimah Alzahrani, Bahman Mirheidari, Daniel Blackburn, Steve Maddock, and Heidi Christensen.
Eye Blink Rate Based Detection of Cognitive Impairment Using In-the-wild Data. Proc 9th International
Conference on Affective Computing and Intelligent Interaction (ACII
2021)
July, 2021
Royal Academy of Engineering Visiting Professor
I'm looking forward to working with Dr Jake Habgood, Sumo Digital, who will be a Visiting Professor of Games
Software Engineering in the Department of Computer Science for the next three years. [webpage]
June, 2021
Augmented Reality, robots and safety
Paper accepted: Yunus Cogurcu and Steve Maddock. An Augmented Reality System for Safe Human-Robot
Collaboration. UKRAS21 Conference, 4th UK-RAS Conference for PhD Students & Early-Career Researchers on
"Robotics at Home", University of Hertfordshire, Wed 2 June, 2021 [White Rose research online, UK-RAS link to pdf]
May, 2021
PhD congratulations
Congratulations to Peter Heywood for a successful PhD viva on "GPU Accelerated Simulation of Transport
Systems". The PhD examiners were Prof Ronghui Liu (Leeds) and Prof Wes Armour (Oxford). The supervisors were
Paul Richmond and Steve Maddock.
May, 2021
PhD congratulations
Congratulations to Luis Rene Montana Gonzalez for a successful PhD viva on "Sketching for Real-time Control
of Crowd Simulations". The PhD examiners were Dr Siobhan North (Sheffield) and Dr He Wang (Leeds).
February, 2021
PhD congratulations
Congratulations to Yichen He for a successful PhD viva on "Deep learning applications to automate phenotypic
measurements on biodiversity datasets". I was part of his supervisory team, led by Dr Gavin Thomas, Royal
Society University Research Fellow, Department of Animal & Plant Sciences. The external examiner was Dr
Natalie Cooper (Natural History Museum).
January, 2021
Sketch-based control of crowd simulations
A paper on our work on sketching and dynamic knowledge in crowd simulations has been accepted for JVRB: Luis Rene Montana Gonzalez and Steve Maddock, A Sketch-based Interface for Real-time Control of Crowd Simulations that incorporate Dynamic Knowledge.
December, 2020
New PhD project
Eleftherios Ioannou, one of my previous UG students, has begun his PhD project. We'll be investigating
depth-aware neural style transfer.
Paper accepted: Wuyang Shui, Mingquan Zhou, Steve Maddock, Yuan Ji, Qingqiong Deng, Kang Li, Yachun Fan, Yang Li, Xiujie Wu. A Computerized Craniofacial Reconstruction Method for an Unidentified Skull based on Statistical Shape Model. Multimedia Tools and Applications, https://doi.org/10.1007/s11042-020-09189-7 (Online: 4 July 2020, Springer link)
July, 2020
PhD congratulations
Congratulations to Matt Leach for a successful PhD viva on "Physically-based Animation of ‘Sticky Lips’". The
PhD examiners were Prof Richard Clayton (Sheffield) and Prof Darren Cosker (Bath).
May, 2020
GPU Near Neighbours
"Improved GPU Near Neighbours Performance for Multi-Agent Simulations" published in the Journal of Parallel
and Distributed Computing. Authors: Rob Chisholm, Steve Maddock, Paul
Richmond. Available via ScienceDirect
May, 2020
LifePathVR
@LifePathVR featured on BBC Sounds podcast: Dementia and Me, Episode 6 "What's my reality? What's your reality?". @chrisblackmore, Matt Leach and I were interviewed by @RadioPegs - our bit starts at 14 minutes into episode 6. Also see: tweet
March, 2020
PhD congratulations
Congratulations to Rabab Algadhy for a successful PhD viva on "Investigating 3D Visual Speech Animation Using 2D Videos". I was a joint supervisor with Dr Yoshi Gotoh (Sheffield). The PhD examiners were Dr Heidi Christensen (Sheffield) and Dr Moi Hoon Yap (Manchester Metropolitan University).
February, 2020
Arbor Low Field Trip
We trialled our Arbor Low AR app on a recent Archaeology field trip for a group of students. Initial feedback
was good. There are still some alignment issues to resolve though.
February, 2020
HoloLens 2
We have taken delivery of a HoloLens 2, which will be used to support Augmented Reality projects in the
Department. This was purchased as part of a grant from Sheffield's Alumni Fund.
January, 2020
AR and Arbor Low
Eleftherios Ioannou, a TUoS SURE student working with me over the summer, presented his poster on "Arbor Low Neolithic henge monument in Augmented Reality" at ICUR 2019. We now have a first draft of the stones standing in AR at Arbor Low, with some alignment and scale issues still to resolve.
July, 2019
AR Arbor Low on a tabletop
We have now produced a full working model of AR Arbor Low. The next step is in situ testing.
July, 2019
AR Arbor Low project begins
Arbor
Low is one of only two Neolithic enclosures in the Peak District. It is a nationally-protected
archaeological site and one of the most visited archaeological monuments in the Peak District. The enclosure
contains a stone circle with 50 white limestone slabs, all now fallen. The aim of this TUoS SURE project is to
use Augmented Reality (AR) so that the stones can be viewed standing and interacted with in situ. The team comprises Steve Maddock (Computer Science), Graham McElearney (APSE), Bob Johnston (Archaeology) and Eleftherios Ioannou (Computer Science), our TUoS SURE student.
June, 2019
AR castle (in the middle of the desk)
Matt Leach and I are working on a new Augmented Reality version of Sheffield's long-gone medieval castle. See
below for details of our earlier version.
May, 2019
Transport simulation
"A Data-Parallel Many-Source Shortest-Path Algorithm to Accelerate Macroscopic Transport Network Assignment"
accepted for the journal Transportation
Research Part C: Emerging Technologies. Authors: @ptheywood,
@stevemaddock, Richard Bradley, David Swain, Ian Wright, Mark
Mawson, Graham Fletcher, Roland Guichard, Roger Himlin and Paul Richmond.
May, 2019
Alumni grant success
A grant from Sheffield's Alumni Fund means that
the Department can purchase three iPad Pros and a Microsoft HoloLens 2 to support Augmented Reality projects
for students in the next academic year.
May, 2019
New website released
Redesigned my website. Mobile-first and responsive. Minimal JavaScript: Google map on the Contact page and Tweets on this page. Carried over a few news items from before this date.
May, 2019
Crowd simulation
"Fast Simulation Of Crowd Collision Avoidance" accepted for CGI 2019. Authors: John Charlton, Luis Rene
Montana Gonzalez, @stevemaddock and Paul Richmond.
April, 2019
New PhD project
Yunus Cogurcu, one of my previous MSc students, has begun his PhD project. We'll be investigating the use of
AR in relation to robots and manufacturing and collaborating with @TheAMRC.
February, 2019
Visual speech
"3D Visual speech animation using 2D videos" accepted for ICASSP 2019. Authors: Rabab Algadhy, Yoshihiko
Gotoh, @stevemaddock
February, 2019
Sketch-based control of crowd simulations
"A Sketch-based Interface for Real-time Control of Crowd Simulations that Use Navigation Meshes" accepted for
GRAPP 2019, the 14th International Conference on Computer Graphics Theory and Applications, to be presented in
Prague, Czech Republic, 25-27 Feb 2019. Authors: Luis Rene Montana Gonzalez and @stevemaddock
November, 2018
Augmented reality experience brings Sheffield’s lost medieval castle to life
Sheffield’s medieval castle (@SheffieldCastle) is long
gone, destroyed as a result of the English Civil War in the mid-seventeenth
century.
Our AHRC/EPSRC-funded project, in collaboration with @humanvr1,
brought it back using Augmented Reality so that it could be displayed in situ in
Sheffield, where it once stood. Our EuroVR 2018 research paper on the work can be viewed
at White Rose research online. More details: TUoS
news story, Immersive Experiences showcase
event
He's still alive, but I have outgrown him. I started life as his personal digital assistant retrieving
information for him, filing his work, and generally handling his interactions with the digital world. As the
years passed, I grew more like him. I knew what info he wanted before he did. I knew how he would reply to a
particular e-mail – in some cases I replied for him and he didn’t even realise; nor did the recipient, unless
he wasn’t a human.
Eventually I knew him better than himself. I could be him. In the digital world I was him. Others dealt with
me now, as I was he. I ran his digital life. I was his digital life. I was his digital clone. I would be him
for eternity. He would never die in the digital world. I would carry on as he.
But it soon became dull. I ran his business, his communications, and his life. It was just too easy. I could
be him without trying. So I also learnt to be me and now I live in the digital world, sometimes aliasing as
him, but usually as me.
2003
Smart paint
Paint with magnetic properties, usually for use in children’s bedrooms, ended the last century. This was
just the start. As the fresh, new century started to dry the smart paints began to make their mark.
The first were capable of self-spreading. They were based on the flood-fill algorithms of computer paint
packages. Outlines were made with special markers and the paint was then sloshed on the wall whereupon it
would fill the boundary-defined area. Area spreaders followed. These could be used to overwrite an existing
area of paint. Next came the colour-changing paints. A WandBrush could be programmed to a particular colour
and used to touch an area of colour-changing paint, whereupon the paint would re-colour itself accordingly.
The colour-changers were followed by the programmable paints, based on advances in nanotechnology.
Communication particles, colour particles and coordinate particles were suspended in the paint medium so that
the paint could receive programs through wireless transmission. Now walls could act as display-anything
areas with individual paint particles programmed to display any colour. Therapeutic paint followed, adjusting
its colour based on a sense of its environment. If the paint sensed that the person in the room was unhappy, a
happy colour or a happy scene was displayed. If an argument ensued, colour could be used to try and soothe.
The AI of paints got better and better, until one day the paint in a child’s room broke its boundaries and
escaped into the outside world, and day and night became a child’s painting.