The Liverpool-York Head Model

Hang Dai, Nick Pears, Will Smith, Department of Computer Science, University of York, UK

Christian Duncan, Alder Hey Craniofacial Unit, Liverpool, UK

The Liverpool-York Head Model (LYHM) project is a collaboration between the Craniofacial Unit at Alder Hey Hospital, Liverpool (UK) and the Department of Computer Science, University of York (UK). The collaboration aims to build 3D models of human face and cranium variation in order to support clinical planning and surgical intervention evaluation tools for craniofacial surgeons. This project is a partner project of the Headspace Dataset Project, which employed 3dMD's static 3dMDhead scanning system to capture 1519 3D images of the human head (filtered to 1212 images for our global model build). Work in 2017-18 was supported by Google Faculty Awards in the 'Headspace Online' project, sponsored by Forrester Cole (Google, Cambridge, MA); this allows university-based researchers to download the dataset (see the Headspace Dataset Project web page).

The image below shows the cranial extent of our head models compared to the well-known Basel Face Model (BFM) and the Large Scale Facial Model (LSFM) developed by the iBUG group at Imperial College London, UK.

LYHM: public release date: 22nd October 2017

The head models have been freely available for research and education purposes since the date of ICCV 2017 (22nd October 2017). To obtain access to the models (and/or the associated Headspace training data), you need to complete and sign the user agreement form. This agreement should be completed by an academic staff member (not a student). The signed form should be emailed to Nick Pears (nick.pears@york.ac.uk). We will verify your request and contact you with instructions on how to download the model package. Note that the agreement requires that:

  1. The models are used for non-commercial research and education purposes only.
  2. You agree not to copy, sell, trade, or exploit the model for any commercial purpose.
  3. In any published research using the models, you cite the associated paper listed in the Publications section below.

The model download will include:

  1. 3D Morphable Head Models (both shape and texture): several versions of the model build are made available, such as general population and children only.
  2. MATLAB code to display the model and vary its shape along the principal modes of shape variation.
  3. Information describing the model structure and the model build.

A screenshot of the MATLAB GUI, with the head shape controlled by the slider for mode 1 set to -3 standard deviations (SD), is shown below:
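To make the behaviour of the sliders concrete, the sketch below shows how a single head shape is generated from the model's principal modes of variation. It is a minimal illustration only, assuming a typical PCA-based morphable model layout: the file name and struct fields (shapeMU, shapePC, shapeEV, tri) are hypothetical placeholders and may differ from the structure in the released MATLAB package.

    % Minimal sketch: generate a head shape at -3 SD along mode 1.
    % File and field names are hypothetical; consult the model
    % documentation in the download for the actual structure.
    load('LYHM_global.mat', 'lyhm');       % model struct (hypothetical name)

    b    = zeros(size(lyhm.shapeEV));      % shape parameters in SD units
    b(1) = -3;                             % slider for mode 1 set to -3 SD

    % Mean shape plus weighted principal components (in this sketch,
    % shapeEV holds the per-mode standard deviations).
    shape = lyhm.shapeMU + lyhm.shapePC * (b .* lyhm.shapeEV);

    % Reshape the 3N x 1 vertex vector to N x 3 and render the mesh.
    V = reshape(shape, 3, [])';
    trisurf(lyhm.tri, V(:,1), V(:,2), V(:,3), ...
            'FaceColor', [0.8 0.8 0.8], 'EdgeColor', 'none');
    axis equal; camlight; lighting gouraud;

Setting all entries of b to zero reproduces the mean head; moving any single entry between -3 and +3 mimics the behaviour of the corresponding GUI slider.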

Animated head models constructed from over 1200 subjects

Our global models are constructed from over 1200 subjects with an even split of males and females. This work was presented at ICCV 2017. The first movie shows an animation of the shape model only.

We have also modelled texture in the dataset. Note that subjects wear tight-fitting latex caps (each subject's cap is either white or blue).

Using the model to perform age regression (age given in years)
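As a rough indication of how such a regression can be set up, the sketch below fits a linear map from subject age to shape parameters and then synthesizes the expected head at a target age. It is only a sketch under assumed variable names (B for the per-subject shape parameters in SD units, ages for the subject ages, and the lyhm struct from the sketch above) and is not necessarily the method used in our papers.

    % Minimal sketch: linear age regression in the model's parameter space.
    % B (k x m) holds the shape parameters of the m training subjects in SD
    % units, ages (1 x m) their ages in years; both names are hypothetical.
    W = [ages; ones(1, numel(ages))];    % 2 x m design matrix (age + bias)
    A = B * pinv(W);                     % k x 2 map: parameters = A * [age; 1]

    % Synthesize the expected head shape at a target age, e.g. 10 years.
    targetAge = 10;
    b     = A * [targetAge; 1];
    shape = lyhm.shapeMU + lyhm.shapePC * (b .* lyhm.shapeEV);

Sweeping targetAge over a range of years and rendering each resulting shape gives an age-regression animation of the kind shown above.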

Publications

3D morphable models

  1. Statistical Modeling of Craniofacial Shape and Texture
      H. Dai, N. E. Pears, W. Smith and C. Duncan
      International Journal of Computer Vision, 128(2), pp. 547-571 (2020)
      [Springer PDF Link][DOI][BibTeX]

  2. Symmetric Shape Morphing for 3D Face and Head Modelling
      H. Dai, N. E. Pears, W. Smith and C. Duncan
      Proc. 2018 IEEE Int. Conf. Automatic Face and Gesture Recognition (FG 2018)

  3. A Data-augmented 3D Morphable Model of the Ear
      H. Dai, N. E. Pears and W. Smith
      Proc. 2018 IEEE Int. Conf. Automatic Face and Gesture Recognition (FG 2018)

  4. A 3D Morphable Model of Craniofacial Shape and Texture Variation
      H. Dai, N. E. Pears, W. Smith and C. Duncan
      Proc. 2017 Int. Conf. Computer Vision (ICCV 2017)
      [PDF][BibTeX]

  5. Symmetry-factored Statistical Modelling of Craniofacial Shape
      H. Dai, W. Smith, N. E. Pears, and C. Duncan
      Proc. 2017 PeopleCap Workshop, Int. Conf. Computer Vision (ICCV 2017) pp. 786-794
      [PDF][BibTeX]

2D morphable models

  1. Modelling of Orthogonal Craniofacial Profiles
      H. Dai, N. E. Pears and C. Duncan
      Journal of Imaging, vol 3, number 55 (2017)
      [PDF][DOI][BibTeX]

  2. A 2D Morphable Model of Craniofacial Profile and Its Application to Craniosynostosis
      H. Dai, N. E. Pears and C. Duncan
      Proc. 2017 Conf. Medical Image Understanding and Analysis (MIUA 2017).
      Communications in Computer and Information Science, pages 731-742, vol 723. Springer, Cham
      [DOI]

Clinical abstracts

  1. A Morphable Model of the Human Head Validating the Outcomes of an Age-Dependent Scaphocephaly Correction
      B. Robertson, H. Dai, N. E. Pears and C. Duncan
      Abstract in International Journal of Oral and Maxillofacial Surgery (vol 46, p68, 2017).
      [DOI]

  2. A Morphable Profile Model of the Human Head as an Outcome Tool for Craniosynostosis Surgery
      Christian Duncan, Rachel Armstrong (Alder Hey Hospital, Liverpool) and Nick E. Pears
      16th Biennial Congress of the International Society of Craniofacial Surgery (ISCFS), Tokyo Bay, Japan, September 14th - 18th, 2015.
      Poster abstract P74 (page e91) in British Journal of Oral and Maxillofacial Surgery 54 (2016).
      [DOI]

Initial arXiv publication

Our arXiv paper details the early 100-subject prototype models that were developed in 2015 and is listed below.

  1. Automatic 3D modelling of craniofacial form
      N. E. Pears and C. Duncan
      arXiv, 21st Jan 2016
      [arXiv]

Project sponsors

We thank the funding bodies that have supported this research, including Google (Faculty Awards).
