Evolving Mazes from Images

Liang Wan         Xiaopei Liu         Tien-Tsin Wong       Chi-Sing Leung

IEEE Transactions on Visualization and Computer Graphics, Vol. 16, No. 2, March/April 2010, pp. 287-297.



We propose a novel reaction-diffusion (RD) simulator to evolve image-resembling mazes. The evolved mazes faithfully preserve the salient interior structures in the source images. Since it is difficult to control the generation of desired patterns with traditional reaction diffusion, we develop our RD simulator on a different computational platform, cellular neural networks. Based on the proposed simulator, we can generate mazes that exhibit either a regular or an organic look, and mazes that range from uniform to spatially varying appearance. Our simulator also provides high controllability of maze appearance: users can directly and intuitively "paint" the desired appearance of mazes in a spatially varying manner via a set of brushes. In addition, the evolutionary nature of our method naturally generates mazes without any obvious seams, even when the image is a composite of multiple sources. We validate our method by evolving several interesting mazes from different source images.
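The paper's simulator is built on cellular neural networks rather than a classical RD system, but the underlying idea of evolving labyrinthine patterns can be illustrated with a standard Gray-Scott reaction-diffusion model. The sketch below is not the authors' method; the parameter values (`Du`, `Dv`, `F`, `k`) and the seeding scheme are generic assumptions chosen to land in a maze-like pattern regime.

```python
import numpy as np

def laplacian(f):
    # 5-point stencil with periodic (wrap-around) boundaries
    return (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
            np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4.0 * f)

def gray_scott_step(u, v, Du=0.16, Dv=0.08, F=0.035, k=0.065, dt=1.0):
    # One explicit Euler step of the Gray-Scott reaction-diffusion model.
    # u is the "feed" chemical, v the "kill" chemical; their interplay
    # produces stripe/labyrinth patterns for these (assumed) parameters.
    uvv = u * v * v
    u += dt * (Du * laplacian(u) - uvv + F * (1.0 - u))
    v += dt * (Dv * laplacian(v) + uvv - (F + k) * v)
    return u, v

# Evolve a pattern from a small noisy seed
rng = np.random.default_rng(0)
u = np.ones((128, 128))
v = np.zeros((128, 128))
v[56:72, 56:72] = 0.5                       # seed square
u += 0.02 * rng.standard_normal(u.shape)
for _ in range(2000):
    u, v = gray_scott_step(u, v)
maze = v > 0.2                               # threshold into walls/corridors
```

In the paper, the key difference is that the cellular-neural-network formulation makes the evolving pattern controllable (e.g., steered by a source image and by user brushes), which plain Gray-Scott dynamics like the above do not provide.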





Paper (PDF, 6.0M)

Video (DivX 6.8, 26.4M)




@article{wan2010evolving,
    author   = {Liang Wan and Xiaopei Liu and
                Tien-Tsin Wong and Chi-Sing Leung},
    title    = {Evolving Mazes from Images},
    journal  = {IEEE Transactions on Visualization
                and Computer Graphics},
    month    = {March/April},
    year     = {2010},
    volume   = {16},
    number   = {2},
    pages    = {287--297},
}




Winged Guardian Bull
(to be available)