ClayFlow is concept clay with sensors and electronics embedded. Somewhat inspired by Perfect Red, Conductive Paint, and electronics integrated into paper, ClayFlow lets a person observe the world, reshape it as they wish, and get data back from what they changed. For example, say I have a broken lamp and I want to fix it. I can wrap ClayFlow around the break and it will seal it. Because ClayFlow has sensors embedded in it, I will know if the lamp is about to break again, or if anything else has happened to it. The conceptual idea is to be able to fix anything, add anything, or change anything with the shapes you create, and also to know what is going on thanks to the embedded sensors.
We often discuss clay as the “ultimate” radical atom – if only it could be programmed. But no one has dared to actually try putting sensors/actuators in clay itself. Basically Sugru + SensorTape. Sensing is usually much easier than actuating. See SandScape. How can we better sense the shape etc. of clay?
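As a rough sketch of the break-detection idea: if the patch embeds an array of resistive flex/strain sensors, firmware could convert each resistance reading to an approximate strain via the standard gauge-factor relation and flag regions nearing a limit. The sensor wiring, rest resistance, gauge factor, and failure threshold below are all hypothetical, chosen only for illustration.

```python
# Hypothetical sketch: estimating local strain in a ClayFlow patch from an
# embedded array of resistive flex sensors. The rest resistance, gauge
# factor, and strain limit are assumptions, not measured values.

def resistance_to_strain(r_ohms, r_rest=10_000.0, gauge_factor=2.0):
    """Approximate strain from a flex-sensor resistance reading,
    using the gauge-factor relation dR/R = GF * strain."""
    return (r_ohms - r_rest) / (r_rest * gauge_factor)

def patch_status(readings, strain_limit=0.05):
    """Given resistance readings from sensors embedded in the patch,
    report per-sensor strain and whether any region nears failure."""
    strains = [resistance_to_strain(r) for r in readings]
    at_risk = [i for i, s in enumerate(strains) if abs(s) > strain_limit]
    return {"strains": strains, "at_risk": at_risk, "ok": not at_risk}

# Example: three sensors around the repaired lamp joint; the third reads
# well above its rest resistance, suggesting the seal is flexing again.
status = patch_status([10_050.0, 10_120.0, 11_500.0])
print(status["ok"], status["at_risk"])  # → False [2]
```

The same per-sensor readings could feed a coarse shape estimate of the clay itself, which is the harder open question raised above.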
Udayan’s comment: This is interesting. I am assuming you have already seen CMU’s Claytronics vision video. There is some precedent in this direction already. Take a look at Patching Physical Objects, deForm, and KiCad.