A smart home vs an automated home
-
Once a method of sensing people is selected, MySensors can be used as the transport layer. That leads to the question of the actual "smarts". The various MySensors-supported packages seem to track state, allow control, and have scenes, which are good data and tools for the smarts to work on, but they don't seem to be smart themselves. Am I overlooking something?
Commercial products use the 'cloud' to gather a lot of data from local devices and build an AI of sorts that local devices then query for the appropriate response to specific conditions. I'm not interested in sending all my data to the cloud, so I'm only interested in completely local solutions.
Again, this doesn't currently exist (that I know of), but many pieces do. Some are just pieces (Hadoop for storing data, e.g.), some are partway there (Mycroft AI, e.g.), and some have large backers (the Intel Movidius AI accelerator). Some assembly required.
Are there more complete solutions that I may not know of?
What goals do others have?
-
@wallyllama said in A smart home vs an automated home:
@NeverDie nice! Sparkfun has a breakout that is 20% cheaper than just the Omron sensor. This is getting closer to my price range; the radar modules are cheap and might be fun, but this would likely yield a working solution sooner.
Is this the one you found? https://www.sparkfun.com/products/14289
@NeverDie no, it was an AMG8833 breakout, and at Adafruit, not Sparkfun, sorry: $39 US. Mouser and Digi-Key have just the sensor for $22 US in small quantities.
-
The AMG8833 has an 8x8 grid and a 60° field of view, so with an 8' (2.4 m) ceiling it will cover a square roughly 9' (2.8 m) on a side at the floor. One pixel will be about 14" (35 cm) at the floor. That should be plenty of resolution even without interpolation. I suspect interpolation could give an effective grid of 16x16 at least, maybe more.
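To make the interpolation idea concrete, here is a minimal bilinear upscale of an 8x8 frame to a 16x16 virtual grid in plain NumPy. This is my own sketch, not taken from any AMG8833 library; the function name and the fixed test frame are illustrative.

```python
import numpy as np

def interpolate_frame(frame, factor=2):
    """Bilinearly upscale a square thermal frame (e.g. 8x8 from an
    AMG8833) to a finer virtual grid, factor x the resolution."""
    n = frame.shape[0]
    m = n * factor
    # Fractional sample positions on the original grid.
    src = np.linspace(0, n - 1, m)
    i0 = np.floor(src).astype(int)          # lower neighbor index
    i1 = np.minimum(i0 + 1, n - 1)          # upper neighbor index
    t = src - i0                            # blend weight in [0, 1)
    # Interpolate along rows, then along columns.
    rows = frame[i0] * (1 - t)[:, None] + frame[i1] * t[:, None]
    return rows[:, i0] * (1 - t) + rows[:, i1] * t

frame = np.full((8, 8), 22.0)   # fake room at 22 C
frame[3, 4] = 29.0              # one warm pixel (a person?)
hi = interpolate_frame(frame)   # 16x16 virtual grid
```

Because every output pixel is a convex blend of its neighbors, interpolation can never invent temperatures outside the original range; it only smooths the blob's edges.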
Careful planning and mounting in a corner or on a wall would have some trade offs, but might allow for covering a larger area with one sensor.
One trade-off is identification. Is that heat blob a person or @gohan's cat? That might be doable, but telling whether it is Mom, Dad, or a teenager would probably need supplemental information.
Stationary heat sources (lamps, vents, etc.) could be filtered out, probably in several different ways. I have some large windows that may blur the data, but this is where situational awareness would come in. E.g. if (curtains == open && tod == daytime) then apply a filter to pixels x through z; maybe factor in time of year, etc.
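That situational rule could be sketched as a conditional pixel mask. Everything here is illustrative: the window's pixel region and the function names are made up for the example, since the real zones depend on where the sensor is mounted.

```python
import numpy as np

def build_mask(shape, dead_zones):
    """Boolean mask that is False over pixels covered by known
    stationary heat sources (windows, vents, lamps)."""
    mask = np.ones(shape, dtype=bool)
    for (r0, r1, c0, c1) in dead_zones:
        mask[r0:r1, c0:c1] = False
    return mask

def filter_frame(frame, curtains_open, is_daytime,
                 window_zone=(0, 3, 5, 8)):
    """Apply the rule from the text: mask the (hypothetical) window
    pixels only when the curtains are open during the day."""
    if curtains_open and is_daytime:
        mask = build_mask(frame.shape, [window_zone])
        return np.where(mask, frame, np.nan)  # NaN = ignore this pixel
    return frame
```

Marking ignored pixels as NaN (rather than zero) keeps downstream averaging honest: `np.nanmean` and friends skip them instead of dragging the average down.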
Other obstacles would probably look like cold spots and, unless they are large, wouldn't affect detection of people. They might dim a blob a bit, so maybe a filter would be needed here too.
This is quite doable. I've been thinking about it for a while, and seeing usable sensors for effectively half price has me a bit excited. I apologize if I have monopolized the podium a bit.
-
You are pretty much facing the same problems as all the engineers working on self-driving cars, or anything else using computer vision (which is going to be tricky to handle on an Arduino alone, and that is why many services rely on cloud computing).
-
@gohan true for the larger goals, but this sensor is 64 pixels (256 with interpolation) and we need to track a dot. I think an Arduino could gather the data, do a bit of preprocessing, and (the MySensors part) transmit the data to a Raspberry Pi for whole-house tracking.
This is pretty low-res and I think a Pi could handle it. If not, Intel has a Movidius USB stick meant for computer vision/AI acceleration, and I believe OpenCV has been ported to it. So while this is on the bleeding edge, some of the blood has dried.
The other plus is that houses move slower than cars; unless people are running indoors, a 2 to 3 second refresh rate should be accurate enough.
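The "track a dot" preprocessing could be as simple as thresholding and taking a weighted centroid; a sketch of what the Arduino or Pi might compute per frame. The function name and the 2 °C margin are my guesses, not from any library.

```python
import numpy as np

def find_person(frame, ambient_margin=2.0):
    """Very rough single-target tracker: threshold the frame a couple
    of degrees above its median (a cheap ambient estimate), then return
    the temperature-weighted centroid of the warm pixels, or None if
    nothing warm is present."""
    ambient = np.median(frame)
    warm = frame > ambient + ambient_margin
    if not warm.any():
        return None
    rows, cols = np.nonzero(warm)
    weights = frame[rows, cols] - ambient
    r = np.average(rows, weights=weights)
    c = np.average(cols, weights=weights)
    return (r, c)   # grid coordinates; a node need only transmit two floats
```

Two floats every couple of seconds is a trivial payload for any MySensors transport, which is the point: the heavy lifting stays local to the node.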
This is a large project and MySensors would only be a portion of it, so for now I'll try to limit myself to talking about how a node based on this sensor would work and whether it fits into MySensors properly. There is plenty there to discuss.
@dbemowsk again, sorry for hijacking your thread. I'm going to look at the guides for submitting a node to openhardware.io; I don't promise I'll be fast, so don't stop working on your own ideas.
-
To what degree will detection range be an issue with these sensors?
-
@NeverDie the data sheet says 7 meters max, so there is probably enough margin, at least for typical room sizes in the US. I think the 60° FOV will be the bigger issue: getting coverage. Imagine you place the sensor in the center of the ceiling of a square room about 9 ft on a side with an 8 ft ceiling (~2.8 x 2.4 m). The sensor's field of view would exactly cover the floor, but it is shaped like a pyramid with the sensor at the peak, so if you stand flat against a wall, only your feet would be in view.
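The coverage numbers are easy to sanity-check from the pyramid geometry: the footprint of a ceiling-mounted sensor is a square of side 2·h·tan(FOV/2). A small helper (names are mine) makes the trade-off between ceiling height, FOV, and per-pixel size explicit:

```python
import math

def floor_coverage(ceiling_height_m, fov_deg=60.0, pixels=8):
    """Side length of the square a ceiling-mounted sensor sees at
    floor level, and the size of one pixel there, from the pyramid
    geometry: side = 2 * h * tan(FOV / 2)."""
    side = 2 * ceiling_height_m * math.tan(math.radians(fov_deg / 2))
    return side, side / pixels

side, pixel = floor_coverage(2.4)  # 8 ft ceiling, 60 degree FOV
# side is roughly 2.8 m (~9 ft); one pixel is roughly 0.35 m
```

The same function shows why corner or wall mounting is tempting: tilting the sensor trades the neat square footprint for a longer, skewed one, covering more floor at the cost of uneven pixel sizes.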
@gohan's suggestion of Bluetooth tags doesn't have that problem; a tag can be seen anywhere the signal reaches, and you can use multiple detectors for coverage and triangulation. If you have a smartwatch or phone you always carry, then you don't even need a separate tag. It is relatively cheap and simple, and most of the tech is done already.
(Now here is where I loop around and start spinning in circles.) I don't want to have to carry anything; it should be possible to detect my presence from all the signals bouncing off me already, like light, or IR, or WiFi, or radar... and then the googling happens.
-
Perhaps an alternative definition of a smart home could be whether it connects to the cloud? Or whether it uses big data / machine learning / aggregation of the habits of many households to find solutions to things?
Yet another, for me, is whether smart means 'ethical'. For example, a cloud-connected home that shares my life patterns with third parties (which is most devices these days...) should never be called smart.
-
Instead of going off on wild tangents about privacy and the like, I suggest we re-focus by asking what good or useful thing we might accomplish if we could make the thermal 8x8 pixel sensor work. After all, this is the first thread to consider it, and it would be a shame to waste the opportunity.
-
@NeverDie I agree, but it is pretty much related, as I really don't think image processing could be done on a microcontroller without the help of a backend server that actually collects all the data from the sensors, correlates it, and then gives it a meaning that can actually be used.
-
In that case, I suggest @wallyllama start a new thread devoted just to the sensor and how best to make use of it. I wager something can be accomplished without resorting to full-blown data fusion. Plainly, if you tie your success to difficult, unsolved research problems that have long resisted solution, you will quickly bog down.
The two obvious wins are direction of movement and, as has already been mentioned, finer location granularity within a room. Since it's thermal, it could know that you're sitting on the couch even if you're not moving. That's big: just think of all the occupancy sensors that wrongly conclude the room is empty when nothing is moving. We've all had that experience, I'm sure.
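The motionless-person-on-the-couch case is exactly where a thermal sensor beats a PIR: instead of frame differencing, compare each frame against an empty-room baseline. A sketch of that idea, with made-up class and parameter names; the key detail is that the baseline only adapts while the room is judged empty, so a still person is never absorbed into it.

```python
import numpy as np

class PresenceDetector:
    """Declare a room occupied when enough pixels sit above an
    'empty room' baseline frame, regardless of motion."""

    def __init__(self, baseline, margin=1.5, min_pixels=2, alpha=0.02):
        self.baseline = baseline.astype(float)  # empty-room reference
        self.margin = margin                    # degrees above baseline
        self.min_pixels = min_pixels            # warm pixels required
        self.alpha = alpha                      # baseline adaptation rate

    def update(self, frame):
        warm = np.count_nonzero(frame > self.baseline + self.margin)
        occupied = warm >= self.min_pixels
        if not occupied:
            # Slow exponential update tracks seasonal/daily drift in
            # room temperature, but only while nobody is present.
            self.baseline += self.alpha * (frame - self.baseline)
        return occupied
```

This is the cheap end of the spectrum, but it already fixes the "lights went out while I was reading" failure mode of motion-only sensors.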
-
You guys are thinking of a complex solution: a single package that does it all. What if you dumb down the scenario a bit? Don't try to make the determination using only one type of sensor. After doing some more research on the person who did the infrared doorway sensors, he said it was a pretty reliable way of counting room occupants. So maybe you use the infrared doorway sensors to count the number of people in an area. Once you have a reliable count, you start looking at ways of identifying who those occupants are, if needed. Thinking broadly, putting some fuzzy logic behind data from a number of other sensors, whatever they may be, might give you some kind of fingerprint for a person that could be used for identification. That approach may give you a little better accuracy too, depending on the sensors and logic you use.
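The doorway counter is the classic two-beam trick: an outer and an inner IR beam, and the order in which they break tells you direction. A deliberately simple state-machine sketch (real doorways need debouncing and timeouts, which I've left out):

```python
def count_crossing(events, count=0):
    """Update an occupancy count from a stream of beam-break events.
    'A' is the outer beam, 'B' the inner beam: A then B means someone
    entered, B then A means someone left."""
    last = None
    for beam in events:
        if last == 'A' and beam == 'B':
            count += 1            # entry: outer beam broke first
            last = None
        elif last == 'B' and beam == 'A':
            count = max(0, count - 1)  # exit: inner beam broke first
            last = None
        else:
            last = beam
    return count
```

The count is exactly the kind of small integer MySensors is built to carry; the fuzzy-logic identification layer can then live on the controller, where the counts from every doorway meet.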
-
@dbemowsk that "fuzzy logic" is what big companies are spending millions to develop, and that's why I'm not expecting much from a few sensors and Arduinos.
@gohan I get it, but at this stage, any bits and pieces you can put together that do even a fraction of it are better than nothing. I figure if I can start with the people-counting part and somehow layer things on from there, I'll be a little ahead of the game.
I don't want anyone to worry about hijacking my thread. Speak freely; this is how good ideas come to be, and it's precisely why I started this thread. I figured it would spark some creativity from the community.
-
So looking at the MySensors end of this, would it be too far off to think of adding a new node type, "person"? A person node could have customizable properties that let you define useful bits of data related to that person, for example preferred room temperature or preferred light level. Heck, you could even have a room or area property that gets set when the system sees you move to a different area. Then, once you figure out better occupancy sensing, you can automatically set user-preferred light levels and room temperatures based on who is in the room, and dial them down after a person leaves. If more than one person is in a room, it could average the properties of everyone present to find a happy medium for settings like room temperature.
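A sketch of what that "person" object and the happy-medium averaging might look like on the controller side. The class, field names, and sample people are all hypothetical; nothing like this exists in MySensors today.

```python
from dataclasses import dataclass

@dataclass
class Person:
    """Hypothetical controller-side 'person' record holding
    per-person comfort preferences and current location."""
    name: str
    preferred_temp_c: float
    preferred_light_pct: int
    area: str = "away"

def room_setpoints(people, area):
    """Average the preferences of everyone currently in an area to
    pick a happy-medium setting; None means the area is empty."""
    here = [p for p in people if p.area == area]
    if not here:
        return None
    temp = sum(p.preferred_temp_c for p in here) / len(here)
    light = sum(p.preferred_light_pct for p in here) // len(here)
    return temp, light

mom = Person("Mom", 22.0, 60, area="living_room")
dad = Person("Dad", 20.0, 40, area="living_room")
```

A None result is useful in its own right: it's the "dial everything down, nobody's here" signal.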
-
I think @dbemowsk is hinting at something that fits my understanding of "emergent behavior": individually simple things interact and create more complex results. "How many" and "whom" are different questions. Counters in doorways, plus a list of whose phones are at home, maybe some historical data about who likes to sit in which chair. There are probably better combinations, but that is what I got from his recent comments.
-
@dbemowsk said in A smart home vs an automated home:
So looking at the MySensors end of this, would it be too far off to think of adding a new node type, "person". A person node could have customizable properties that would allow you to define different useful bits of data related to that person.
I'd like to hear more about how you would use it. Below is my two pennies' worth.
My thinking is that MySensors is a transport for relatively simple data (state, values, counts, etc.), the things nodes need to set up the environment or report back to central command.
A complex object like "person" could have all kinds of attributes and preferences, which would modify the values sent to nodes. For example, the curtain controller knows to open during the day, close at night, and maybe close for an hour at 10 am in the summer when the sun shines directly in and heats up the house (that last one could also be a light sensor). But if the weather says it is clear, Kent is in the living room, and it is night, open the curtains; that would be an override coming from central. The node controlling the curtain doesn't need to know it is me in the room; it just needs to accept the modifiers.
I say this mostly because, as @gohan points out, Arduinos aren't terribly powerful, and telling them too much may just confuse them.
I liken it to the body. E.g. your finger doesn't have to know whether you are walking up as you push a doorbell; it just extends on command and reports that it made contact, moved forward slightly, and hit a stop. Your spine may get involved if the finger reports excessive heat or something gooey on the switch, and pulls the hand back in reflex.
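The curtain example boils down to a small rule function on the central controller: the node only ever receives "open" or "close". A sketch with entirely illustrative thresholds (night hours, summer months, and the 10 am sun hour are guesses standing in for real config):

```python
def curtain_command(hour, weather_clear, occupant_in_room, month):
    """Central-side rule for the curtain example: a default schedule
    plus a person-aware night override. The curtain node never learns
    *who* triggered the command, only what to do."""
    is_night = hour < 7 or hour >= 21
    summer = month in (6, 7, 8)
    if is_night and weather_clear and occupant_in_room:
        return "open"   # the override from central command (stargazing)
    if summer and hour == 10:
        return "close"  # block direct sun during the hot hour
    return "close" if is_night else "open"
```

Keeping the rule central and the node dumb matches the finger/spine split above: the node executes and reports, while the judgment lives where the context is.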
-
@wallyllama As to your first comment about "emergent behavior", that's pretty much what I was getting at.
As to the MySensors node, my thought when I mentioned the "person" node was possibly some kind of MySensorized identifier or tag for a person, much like a Bluetooth tag. The more I thought about it, though, you are correct that there would be all kinds of attributes, and most of them wouldn't need to be tied to the tag. The "person" might instead live on the controller side, where the processing power is greater and where most of that data would be dealt with anyway.