Are Your Presents Spying On You?

This is a podcast episode titled "Are Your Presents Spying On You?" The summary for this episode is: We discuss the security and privacy of connected gifts this holiday shopping season.
Now w/ 100% More Director of Security Research
00:24 MIN
This Episode Is Not In Time To Help w/ Holiday Shopping
00:36 MIN
Jerry Has An Extensive Background In Breaking and Fixing Things
01:09 MIN
Security Adds Friction; Friction Hurts Adoption
01:44 MIN
Holiday Pro Tip
00:44 MIN
IOT On Wheels
01:38 MIN
Go Check Out Mozilla's *Privacy Not Included Report
01:57 MIN
Ring Doorbells
04:10 MIN
Dyson Pure & Does Anyone Read the EULA?
03:21 MIN
Hamilton Beach Smart Coffee Maker
01:06 MIN
VOICE ASSISTANT ALL THE THINGS!!!
00:36 MIN
Moleskine Smart Writing Set & Corporate Espionage
02:32 MIN
Early Adopters & The Pointy-Haired Boss Crowd
03:09 MIN
DJI Mavic Mini & Vulnerable Apps
01:42 MIN
KidKraft Amazon Alexa Kitchen & Direct Marketing To Children
03:45 MIN
Wearables Are Tried and True Data Sprinklers
02:41 MIN
DNA Tests - Just Don't Give These As Gifts
04:55 MIN
Solving Cold Cases & Determining Insurance Premiums
01:50 MIN
Jerry's Final Tips
05:43 MIN

Dan Mellinger: Today on Security Science: Are your gifts spying on you? Hello, and thank you for joining us. I'm Dan Mellinger. And today, we're discussing the security and privacy of devices and gifts this holiday shopping season. My guest today has a ton of experience both securing and finding insecurities in all sorts of devices, from pen testing infotainment systems to finding remote access and control in Google Home Hubs, Kenna Security's resident dangerous device discoverer and head of security, Jerry Gamblin. What's up, Jerry?

Jerry Gamblin: Not much. How's it going, Dan? Happy Holidays.

Dan Mellinger: Happy Holidays. And maybe not so much after we go through this list.

Jerry Gamblin: Well, if you don't have them ordered by now, you might be running a little late. I think I saw on the news today that the USPS was at 130% of capacity.

Dan Mellinger: Geez.

Jerry Gamblin: If you need to order something, do it today.

Dan Mellinger: That is a good point. I mean, yeah, shipping this year and COVID, with people not going out and shopping must be insane. I wonder how FedEx and UPS are doing as well.

Jerry Gamblin: Probably the same. I know that they're at my house nearly every day.

Dan Mellinger: Jesus. Yeah. If the amount of boxes I've been receiving is any indication, they are making some money at least. Anyway, well, here, let's get back on topic real quick. So Jerry, I wanted to go back and do a little bit of history with you because you have some notable examples and experience actually discovering some of these vulnerabilities. I know I specifically worked with you on the Google Home Hub piece from... What was that? 2019.

Jerry Gamblin: Yeah. Long time ago.

Dan Mellinger: Might as well been 10 years ago.

Jerry Gamblin: It was B.C., before COVID.

Dan Mellinger: Seriously, before COVID. Yeah, so I know one of your hobbies, so your day job is keeping Kenna Security secure, helping with our privacy policies, managing our entire infrastructure, all that good stuff. But I know you on the side, Jerry Gamblin, in case anyone who's listening doesn't know, hacked your NFC tile to be your RSA badge back in the day. You like to break things. Can you just give us a little background on some of your experience?

Jerry Gamblin: So I don't like to break things. I like to figure out how they work. Right? It started as a kid taking apart an alarm clock and never getting it back together. Now I'm a little bit older and my toys are a little bit more expensive, and they're all connected to the internet now. I sit here between a Google Home that I use to listen to podcasts during my workday and a TCL TV that I use as a big monitor. Right? They both are on the internet and they both have APIs. The deal is that they were designed to be on home networks. So security is not thought of very highly when these are built because they're building these devices to put into people's homes, and they want them to work, and security adds friction, and friction adds support costs. And these companies don't want to ever have to deal with a security control getting in the way of their product functioning.

Dan Mellinger: Adoption, right?

Jerry Gamblin: Yeah, adoption. So we talked about this when we were talking with Google. Right? They're like, " Yeah. We know anybody on the network can factory reset one of these. We didn't want to add a password that people had to add when they really wanted to do this legitimately," so it's a cost of doing business. So at the end of the day, these devices have zero security built into them. They're designed to never be exposed to the public internet. So every time there's a CVE or a vulnerability found for a class of one of these devices, always read the articles and see how many of them are exposed to the internet, and it's normally less than a couple hundred to a handful because while you can reset everybody's Google Nest Home, you have to be on their network. And that means that you should know them, and that means it's a really jerky thing to do.
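
Jerry's "go see how many are actually exposed" sanity check can be done against an internet-wide index like Shodan rather than by scanning anything yourself. A minimal sketch, assuming the official shodan Python package and your own API key; the query string below is a placeholder, not a real product banner:

# Minimal sketch: ask Shodan how many internet-facing hosts match a query.
# Assumes the official "shodan" package (pip install shodan) and a valid API key.
# The query below is a placeholder banner, not a real product string.
import shodan

API_KEY = "YOUR_SHODAN_API_KEY"  # placeholder

def exposed_count(query: str) -> int:
    """Return the number of internet-exposed hosts Shodan has indexed for a query."""
    api = shodan.Shodan(API_KEY)
    result = api.count(query)  # count() returns totals without pulling full result pages
    return result["total"]

if __name__ == "__main__":
    # Swap in the banner/port described in the vulnerability write-up you're reading.
    print(exposed_count('"Example Smart Device" port:8443'))

The point, as Jerry notes, is that for most of these home devices the number comes back tiny compared to the install base.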

Dan Mellinger: I'm sure Jerry's never done that to anyone for fun.

Jerry Gamblin: Nope.

Dan Mellinger: Ever.

Jerry Gamblin: Because I have to normally... On the other side of that, being the breaker, I'm also the resident fixer, so if I did delete everybody's thing, I would be the guy they called to come in and help them reset it back up 90% of the time.

Dan Mellinger: When you get into those heated family debates around the table, you go break something real quick so they get you to go fix them. That's a good out at the dinner table.

Jerry Gamblin: Hey, Aunt Peggy, is your printer still acting up? Let me go and download these 700 megabytes of printer drivers over your 1.5 megabit DSL. I'll be back in three hours.

Dan Mellinger: Yeah. You haven't updated in 15 years, awesome. Okay, well, some good background there. Oh, real quick. You also do some pen testing on infotainment systems for cars and stuff, right? As a side gig.

Jerry Gamblin: Yeah. Cars are nothing more than IOT on wheels these days. Right? And it's the same way, most of the newer cars have changed their models to give you updates for free, but that's only been the last couple years. If you ever sat in anybody's 2015 Toyota Corolla, it feels like a 2015 Toyota Corolla when the infotainment system comes up because they don't get updates. So there are a ton of bugs that are just in those older cars that'll never be patched because it wasn't in Toyota's R&D budget or their long-term ownership guide to keep a team of 10 or 15 developers on to update that OS. So when it shipped, that's the final version that it has.

Dan Mellinger: Yeah. I mean, most cars up until really, really recently, you had to drive and manually update your maps. Right?

Jerry Gamblin: Yeah. And a lot of the cars now come with the subscription service, so people are like, " Oh, what is this for?" Well, that $100 a month or $50 a month is for the company to afford to keep a team of developers and QAs on staff to be able to provide that update because nobody... Car companies aren't going to pay developers to develop software that they're not going to make any money on.

Dan Mellinger: Oh, man. The days of OTA updates for cars are here.

Jerry Gamblin: Yep, for sure.

Dan Mellinger: Interesting. Well, all right. I'm going to get into the meat of this discussion because we're talking about privacy and security of some of the gifts you may be thinking about giving this year. And so we're basing most of this on Mozilla, so they've done their third annual Privacy Not Included report, which is a shopping guide that identifies connected gadgets and toys and whether or not they're secure and trustworthy in one chart. So it's a really, really cool site. They've done a really good job over the last few years building this out, so we will link to that and you can go play around with it. I'll do a quick overview. So Mozilla researched 76 popular connected gifts available for purchase in the US. And the six categories they did were toys and games, smart home, entertainment, wearables, health and exercise, and pets, which is interesting. So they looked at the privacy policies and read all the EULAs that none of us ever do before we click okay. They sift through the product and app specs, and basically, they're looking for things like: Can you delete your data? How is the data stored? Can this stream live footage? Does it require passwords? Does it collect biometric data? Things like that, to see how secure or insecure they think it is. They'll give it a little bit of a rating. And for this year, they actually have what they call a creep-o-meter, so it's a tool where you get a vote, and shoppers can rate the creepiness of each product using an emoji from super creepy to not creepy at all. And so I picked some of the examples of devices here. We're going to go through those, and then at the end, we'll get Jerry's tips and tricks overall. So wanted to get that out of the way. We'll go ahead and link all that in the show notes for the podcast, so if you're listening on Apple Podcasts or something like that, go to kennaresearch.com/podcasts and you can get the notes there. So with that, Jerry, we'll go right in with the smart home/office category. Ring Doorbell, it got a 48% super creepy rating. And they're owned by Amazon. So what are some of the security implications with the Ring Doorbells?

Jerry Gamblin: Well, the biggest security implication is that they have a pretty open sharing policy, and they'll share your data with anyone. Police officers or police organizations can sign off and build a neighborhood watch program so that they can access the data from your camera. It also has a pretty heavy AI component, where they can start recognizing faces. I have the Nest Doorbell instead of the Ring Doorbell, but it has the same thing. It'll tell you when my son comes home from school. It knows that's his face and has started tagging it automatically. It'll also tell me when there's a package put in front of my house and send me a text when it's picked up, so it knows when that FedEx guy comes and drops off a package, and sends me a message. But all the data isn't done on the doorbell. It's done in Google and Amazon's giant machine learning, AI databases. And I don't know for a fact what they can and can't use that data for. Are they using my son's image or my image to build a giant model to sell to somebody else who wants it? To a retailer, so that they can say, " Oh, there's Jerry. We know him from his doorbell, and he's in your store now. So here's what he likes, or show him this ad."

Dan Mellinger: We're going to track you in our Amazon Go stores now.

Jerry Gamblin: Yeah, exactly.

Dan Mellinger: And shoot products into your hand because we know what you want, Jerry.

Jerry Gamblin: Here, Jerry, we have your order all ready for you.

Dan Mellinger: Wait, what?

Jerry Gamblin: Take out what you don't want from this basket. It'll make it faster.

Dan Mellinger: Seriously. Yeah, so owned by Amazon. I know Amazon's had a ton of questionable privacy practices in the past. And they don't exactly have the best track record there. I think this year, with the BLM movement, they did say that they would not sell facial tracking to law enforcement. Does that count? Do governments count?

Jerry Gamblin: That's probably another podcast, or actually an off the record discussion over some whiskey probably.

Dan Mellinger: Oh, that's definitely happening once we get this vaccine rolling. So I know Mozilla had concerns about, like you said, law enforcement. They have access through the Neighbors app, which is Ring's, I guess, community-based app. And you can actually report issues directly to police through the Neighbors app, and it'll automatically share your address and information, and police can download any video that you let them. And once they do that, where does it go? Do they have a privacy policy? Those are some interesting concerns to think about as well.

Jerry Gamblin: Yeah. But with so many porch pirates here taking your Amazon boxes, you don't really think about that. Right?

Dan Mellinger: It's true. Instant send, get them.

Jerry Gamblin: Yep. I want my Amazon order back. Right?

Dan Mellinger: Yeah.

Jerry Gamblin: Once again, these are ethics questions as a whole. But when it comes down to individuals, we want the options to be able to do this and the ease that it brings to our lives. So we're always willing to trade a little bit of our privacy personally for it. But when you start talking about these questions and this community-wide ethics is where they always start to get the, " Hey, do I really want to do that?"

Dan Mellinger: Awesome. Well, I normally am an early adopter of these kinds of smart home gadgets and things like that. But I live that San Francisco apartment life, so no doorbell cameras yet, which I guess is a good thing. I did not know that they could tell different cats from each other that are walking in front of your door and stuff like that, but that is crazy. All right. So let's move on. Next one, Dyson Pure Cool. So this is something I do have. It's basically an expensive fan. This one received 35% of votes in the middle creepy section, so it wasn't super creepy, it wasn't not creepy; 35% was the most, right in the middle. And the issue here basically comes down to privacy. Right? So sharing personal data, and Mozilla in this case was just more concerned that they don't know what their privacy data rules were, basically. Can you go into a little bit of detail on EULAs and what companies could do with your data?

Jerry Gamblin: Well, personal data is questionable there. I have the same fan. And what it shares is what your air quality is so that it can build kind of a map. And I guess the personal data is that mine always kicks on when I cook because I'm such a bad cook. So it's like, " Oh, Jerry's cooking." So it is part of an ongoing thing because you can build something like that because you do know. Reading the EULAs is hard, and I will admit I just scroll through it and hit click quite a bit because I'm not a lawyer and I don't play one. And I always want to get to my new toy, so I'll pretty well accept anything. But sites like this, that go through and do the work for you, are really important. And then, before COVID, there was a big push by state legislatures, especially in California and New York, to come up with a required reading level for EULAs, like an eighth-grade, common-English EULA that says, " You can have your 40-page EULA that's all in legalese, but you need to have a rider on top of it, in plain English that 95% of the population can understand, saying what data you collect and what you do with it."

Dan Mellinger: Explain like I'm five version.

Jerry Gamblin: Yeah. Exactly. And I would really like to see something like that come out because it is so difficult to understand what these do. I kind of work at Kenna on some of the contracts things, and the difference between if and then, or can and may, will drive you crazy sometimes. Right? The other Jerry, who does all of our contracts here at Kenna, hates the word can because: Does it mean that you're going to? Or does it mean that you might? So it's words like that, that unless you're a lawyer and you understand what the nuances are can change the whole meaning of a EULA in one three letter word.

Dan Mellinger: And I will say too, so my wife actually is a lawyer and she actually does contract review for one of these really, really, really big AI, ML companies that helps you find things on the internet, and she doesn't read our EULAs. I mean, even practicing contract lawyers do not read EULAs apparently, so I think you're in good company there. But yeah, I mean, that makes a ton of sense. A lot of the stuff, like the next one here as well, the Hamilton Beach Smart Coffee Maker. Right? There's no express way it's spying on you. It's a coffee maker. Right? It doesn't have the ability to... It doesn't have cameras. Right? It doesn't have any microphones. It works via you talking to your Google Home, or Alexa, or whatever. Right? So it's not literally spying, like, hey, Jerry said this, I'm going to report it to Hamilton Beach. It's mostly about who owns your data, and that's a lot of the laws in California, as well, and the EU in particular. GDPR and all that. Who owns your data? And so with the Hamilton Beach Smart Coffee Maker, it's very similar. So Mozilla had some issues which got a inaudible of the way, 39% of users rated it super creepy, mostly issues with the privacy policy. And so it's not clear if users own their data and if they can delete that data from Hamilton Beach.

Jerry Gamblin: And I doubt anybody at Hamilton Beach thought about that when they decided to add voice recognition to their successful coffee maker. Right?

Dan Mellinger: Yep.

Jerry Gamblin: Adding IOT to something is the new version 2.0. Right? How much stuff are we buying that just has Google Assistant built in? I updated my Dish Network receivers and now they have Google built into them for some reason. Right? Which is, oh, neat, the microphone has a Google Assistant, but I already have a Google Assistant in my kitchen, and so I don't need that third one. But it's a quick selling point, and everything is getting these voice assistants added into it without, I'm guessing, a lot of thought by the companies that are doing it.

Dan Mellinger: Yeah, yeah. No, that makes a lot of sense. And then the last one, Jerry, I believe you're a fan of Moleskine. Right?

Jerry Gamblin: Yep.

Dan Mellinger: Yeah. I'm a huge fan of Moleskine. I love their notebooks. They're a favorite. I've been buying them for a decade. So the last one here is the Moleskine Smart Writing Set, so this is a set of a notebook and a pen that digitizes your notes and automatically uploads them, which honestly is pretty cool. I think that's pretty bad ass. I'll put this out there that it has a 70% super creepy rating, by the way, so this is the creepiest one that I've seen thus far in Mozilla's Privacy Not Included research. But essentially, this one comes back down to privacy, so their policy only covers their website. It doesn't cover the pen. It doesn't cover the app that you use to upload it. And one other thing, and this goes to exactly what you just said, they said that you can delete your data, but they seem to have left out any contact-us section in their privacy policy. So Mozilla was like, " Can we even contact them? Because there's literally no contact information in the privacy policy after they state you can delete your stuff if you contact us."

Jerry Gamblin: Yeah. That's why I would never have one of those because in meetings when you write down, " This is the stupidest idea I've ever heard," and then your boss says that we're going to do that, and you want to delete that note, you can scratch it out of your Moleskine notebook.

Dan Mellinger: Your Moleskine.

Jerry Gamblin: But getting it out of their cloud might be a little bit of a different story. Right?

Dan Mellinger: Automatically emails Ed, " Jerry wrote this."

Jerry Gamblin: Yeah. No, I mean, it's great, especially when you talk about stuff that's in what is maybe somebody's personal journal, or somebody's proprietary notebook. How much would that be worth? How big of a target would that be to someone doing corporate espionage? If somebody had a startup doing really cool car research and had one of these, every note they were taking would be getting automatically delivered to some cloud, and I'm guessing Moleskine isn't really in the cloud-securing business. So it might be an easy target, might not, I've not looked at the product. But that's not their core competency.

Dan Mellinger: I'm sure it's just in a default S3 bucket somewhere.

Jerry Gamblin: Yeah. Yep.

Dan Mellinger: Yeah. I mean, it's interesting too because when you think about something like a smart pen, one of the value adds, to your point about UX and getting out of the way, is normally OCR. Right? So character recognition, converting handwriting into searchable, readable, machine-understandable text. So yeah, it's basically an index of everything you've written, on top of being able to... What was our main security measure for hundreds and hundreds of years? Our signatures. So now it just has your handwriting, so it's interesting.

Jerry Gamblin: Kind of to get not off topic, but kind of stay on topic a little bit, you see a lot of these technologies early on get adopted by what I like to call the pointy-haired boss crowd. People who have a lot of disposable income buy these or are early adopters. When I was young and I couldn't afford the iPhone 7 or whatever, you know who had one? The CEO, on the day they came out, because they got one. They always had the latest and greatest. So you have people who don't have an opportunity to play with these needing to secure them because people in their C-suite are getting them. This $500 pen might never end up on your security team's desk, but your chief financial officer got one because they're super hard to buy a holiday gift for, so their loved one bought them one. And now they're taking all the company's notes on it without any way for their security team to review it.

Dan Mellinger: That's interesting. Yeah.

Jerry Gamblin: That always gets me because you see that so, so much, these early adopters are normally on the upper echelon and people that are likely to be targeted because of what they do.

Dan Mellinger: That's interesting. So you know what that reminded me of, Google Glass and Glass holes.

Jerry Gamblin: Yep. Exactly.

Dan Mellinger: It was all rich tech people. Right? I remember I was just starting out when that was coming out and didn't have enough money to go buy one. Right? I actually made a reservation, tried, and then got a date to go and get fitted, and was like, " I can't afford to buy this." But yeah, that's interesting because it's the demographic that inherently has a lot of disposable income that plays around with a smart pen when a 25-cent Bic is more than satisfactory for most people.

Jerry Gamblin: Yeah. But you need to get that person in your life who has that title something nice that's going to be cool to see in the office. I know that security people hate the week after holidays because all this new stuff comes trickling in and gets asked to get added to the network.

Dan Mellinger: Spoken like a true CSO. He's like, " Damn it, we got to deal with all these new chief executive gifts that come in."

Jerry Gamblin: Yeah. In the old days it was, " Hey, can you just put my new iPad on the corporate network?" Now it's, " Can you connect my new smart pen to the corporate network?"

Dan Mellinger: I hear smart glasses are making a comeback. Oh, God. I hope not. Anyways, actually speaking of random stuff, we'll jump into some toys here. So this is actually a pretty notorious one, DJI, so the biggest drone manufacturer on the planet, so we have the Mavic Mini here. It got a 53% super creepy rating. And this one is a little more traditional. So there's been demonstrated vulnerabilities on their Android app that enable the collection of a ton of personal data. And these DJI drones are essentially flying high definition cameras that record audio, location, and do facial tracking, and everything the Ring Doorbell does, but mobile and by itself.

Jerry Gamblin: You have to know that's what you're getting into when you buy one of these because A, the government wants to know who you are and where you're flying these at, and where you can and can't fly them at. Right? I know that I should be in Vegas this week for re:Invent, but I'm not. But I know that the whole area of the strip is a no-fly zone because it's so close to the airport there. That has to be in the app, so they have to collect this data so that they can tell where you're at. So I think that this is one of those, they collect the data, but they probably have a reason to. I would be more concerned about some of the other gifts that kids get, like the connected, inaudible toys. Does this Barbie really need to be on wifi? Does this remote-control car have to have Bluetooth enabled? Kind of stuff. That's kind of where I worry a little bit about that.

Dan Mellinger: Well, here, let's get into one, actually. The next one I had for the toy section was the KidKraft Amazon Alexa 2-in-1 Kitchen and Market set. So this one also tied with the Moleskine with a 70% super creepy rating. I totally agree with this one. Basically, it's this kitchen/market set that's designed to teach kids how to shop by having them talk to Amazon Alexa. I mean, that may be the only way they do things going forward, but having your kids have a direct conversation with Amazon seems a little interesting for a toy. So Jerry, thoughts?

Jerry Gamblin: How else is my son going to learn to copy me when I say, " Hey, Alexa. Add scotch to the shopping list," so that he can go back to his room and add that to his play shopping list? No, it's marketing, exactly. It's just trying to hook that next generation of shoppers and make them used to using that tool.

Dan Mellinger: Yeah. And so I'll call this out. So Amazon does say that any child-directed Alexa skills (skills are what they call it when you say something to Alexa and it does something) can't promote any products, content or services, or direct end users to engage with content outside of Alexa, so they can't link kids to other stuff via, I don't know, their smart kitchen set. They can't sell digital or physical products or collect any personal information. So Amazon knows that, hey, letting us talk to your kid directly and vice versa is probably pretty creepy, so they actually have a pretty strong privacy policy. So Mozilla says themselves that probably everything's okay. But again, this is more for me: Do I want my kids being marketed to? I already teach them, I think one of my... My son's just over two years old, and he already says, " Google," because we use our Google to play music and all that all the time. So he's like, " This is how I ask for things," which is kind of weird to think about. Yeah, just more of the marketing implications.

Jerry Gamblin: Yeah. I have a sixth grader. I will tell you that having a Google Home is a godsend for us because he'll ask me. He'll be doing his homework and ask me the definition of a word. I'm like, " Dude, instead of asking me, who probably knows the definition, but it might not be 100% accurate, why don't you just use the supercomputer on your desk, which will give you the dictionary definition of the word 100%, so you know that it's right?" So maybe a little bit of it is not doing my parental responsibilities 100% right. But if I don't know, that's what I'm going to do. So why not cut out the middle person here and go directly to Google?

Dan Mellinger: Yeah. Absolutely. So I think what's weird about the KidKraft thing is it's a toy. So we use our Google Home Hub, but we use that with our son all the time. Right? So hey, show me a brachiosaurus. Tell me about that. Or we'll play music or ask for Dr. Seuss books, stuff like that. It's awesome. It's great for that. But we're normally with him, actively doing this stuff. I think I would be a little less, at least with my son's age, less prone to let him go play with a kitchen market set that is asking him to make his Amazon grocery list directly with the built-in speakers. Right?

Jerry Gamblin: Yep.

Dan Mellinger: Yeah, just crazy time.

Jerry Gamblin: 100% true.

Dan Mellinger: Yeah, it's just interesting. So it's marketing, getting people ingrained, I guess like Apple in schools, people get used to it. So I don't know how I feel about that because it is what we do, to your point. I use Google when I don't know something. I'm going to go ask it upstairs. Right? Oh, man. All right. The fun category, wearables.

Jerry Gamblin: Yeah, if it's wearables, your data is going everywhere. It's one of those things that if you make a personal choice, I think that's fine and you understand that your data is probably being tracked and used by insurance companies, et cetera. It's when you start thinking about giving these as gifts that you really have to take a step back and think about, " Hey, do I know what I'm doing here?" Am I giving this to someone who doesn't have a giant digital footprint, and I'm now going to increase their digital footprint by giving them an Apple Watch or a Fitbit that's going to have all this data flying around on the internet?

Dan Mellinger: Yeah. Well, I don't have any of the specific ones here because when I looked at the wearables that Mozilla was like, " Hey, you should look out for these ones," what's interesting is, I think wearables, from a privacy and security standpoint, have kind of taken a lot of the brunt of the news coverage on how dangerous these are; people have been talking about it for a long time. I think you'll mention this at the end with your tips, but one thing that I noticed was that for almost all of the wearables, it wasn't Apple. It wasn't Fitbit anymore. Those established companies, where security and privacy concerns probably took a hit to their bottom line, corrected that, or at least wrote policies to seem like they did. Most of the wearables that Mozilla was concerned about were some of these new ones out of China, so Huawei, Xiaomi, which tend to be more inexpensive and tend to be owned by large Chinese conglomerates. And not to be xenophobic or anything like that, but I mean, there are concerns about the privacy of Chinese companies because: Can they be compelled by the government to give up your data, all that fun stuff?

Jerry Gamblin: Yeah. It's either that or who owns the data at all. Even not large companies, just Bob's Electronics that you've never heard of before, are they going to be around in three weeks? We talked about Hamilton Beach having a security team; do you think that wearable that you spent $10 for has a security team? Probably not.

Dan Mellinger: Yeah. I mean, if it's a Huawei and a Xiaomi, I'm sure they do. I think a lot of the concern is more about I guess government control, government has a lot more leeway in China to go pull data, stuff like that. Not that US government doesn't, right? I mean-

Jerry Gamblin: If it's in the cloud and a big company has it, I don't expect them to put up a giant fight to protect my personal data. Right?

Dan Mellinger: Yeah, exactly. Well, and here, I mean, getting into the last kind of super privacy area. What about buying those DNA tests, like 23andMe?

Jerry Gamblin: Oh, man. Those are so, so popular right now because it's interesting because everybody wants to know where they came from and what their background is. I think the commercials even kind of play on it a little bit. Right? It's like, " We're not Irish?" I think at some point I saw a study that said 60% of Americans think that they have Irish background when it's 12% or something. So these are really popular gifts to kind of really nix some of those folklore tales about where your family came from. And they're really interesting. What doesn't get covered is what those EULAs allow and what can be done with that data. Hopefully, there's no serial killers in your family that are unknown. But a big one in California was caught because his granddaughter did a 23andMe.

Dan Mellinger: Wait. What? I haven't heard about it.

Jerry Gamblin: I think it was the postal serial killer. I'll find the link and send it to you. We can add it.

Dan Mellinger: Yeah. We'll add it to the podcast notes.

Jerry Gamblin: It was a big deal. That's how they broke the case. The case was cold because they had the DNA but no matches.

Dan Mellinger: No match.

Jerry Gamblin: But somebody found a match because the granddaughter did a DNA test. So it's being used for criminal investigations, which is fine, and what you might expect it to be used for. But this data can also be used for health data. Right? I was walking through Target earlier this week, and instead of just 23andMe telling me where I came from, they had a whole section. What's your metabolism? Are you at a high risk for cancer? What's your kid's eye color going to be? So they're already pooling this data, and if you have data, you know it can be used by insurance companies because insurance companies love data. So does this DNA kit that you're buying your loved one really just give them a bigger bill for their insurance in the future? And the answer is I don't know because the last time I looked, the EULA for a DNA kit is 120 pages of legalese. Right?

Dan Mellinger: Yeah, yeah, yeah.

Jerry Gamblin: So I don't do that, so I might skip that and let them buy that on their own because I would hate to be the person who gave someone a higher insurance bill for the rest of their life.

Dan Mellinger: Like, " Honey, why did our premiums go up by $30 a month?" Yeah, that's interesting. I mean, I did an Ancestry DNA test from ancestry.com three years ago, something like that. It was a while ago. Right? That data is a living entity. So if anyone's done these DNA tests over time, the apps actually give you more information as they test more people and get more granular globally, so yeah, my data's changed significantly over the last three years in the app. I'm half Chinese. My mom's from Taiwan, and so at first, it seems like they didn't have a lot of Chinese users. And so it was just these big globs. You're either Northern China or Southern China. And now it's getting super granular and starting to actually tag in here's some Korean DNA, all that interesting stuff. So the data is still being analyzed, still being crunched, and still being updated years later as well.

Jerry Gamblin: Yeah. And I think an important thing to remember is that sometimes family lineage isn't exactly what everybody thinks. And I've seen some horror stories of people giving these tests to somebody who comes to find out that they were adopted, or their lineage wasn't exactly like-

Dan Mellinger: What they-

Jerry Gamblin: What the story was.

Dan Mellinger: And Jerry, you're responsible because you gave them that test.

Jerry Gamblin: Yeah. Could you see giving your grandparents or your parents one of these tests, and come to find out that the person that they thought was their mother, their biological mother or father wasn't their biological mother or father? Right?

Dan Mellinger: Oh, that's crazy. Yeah.

Jerry Gamblin: Because you put it up there, and like you said, Ancestry will build you a family tree. You give it to your 70 year old grandmother and she does it, and then come to find out, it isn't linking her to who her grandfather was. Right?

Dan Mellinger: So maybe some new etiquette for DNA testing is let the individuals decide if they want to know that information themselves.

Jerry Gamblin: Yeah. It could turn into something that you weren't expecting, and then your COVID Christmas will also be the Christmas that you destroyed your family.

Dan Mellinger: Worst case scenario. Well, speaking of, let's go generally. Jerry's tips. We were talking a little bit about this before we hit record, but overall, it seems like there's a lot of paperwork, a lot of EULAs that people should in theory be reading but that are basically designed not to be read. A lot of privacy concerns, all that good stuff. What are some tips and best practices for people who are trying to buy some cool gifts, with so much stuff that is just connected nowadays, no matter what? You can struggle to find certain types of gifts that aren't. So if you had to provide some tips and best practices for those people shopping, what would that be?

Jerry Gamblin: I really try not to break somebody's habit of what they're building at their house. That means I try not to give them their first connected gift. And then I try not to get them something that's not in their ecosystem. So if you go to their house and they're all Amazon Hubs or whatever, don't buy them something from Google. If they're all Google stuff, don't buy them something from Amazon.

Dan Mellinger: Yeah. That's a good point. We're a Google house here, but I know our parents, or my wife's parents are Alexa. Right?

Jerry Gamblin: Yeah.

Dan Mellinger: That's all the stuff that they got.

Jerry Gamblin: You're going to beep all of these or people's computers are going to be, their home entertainment systems are going to be-

Dan Mellinger: Are we going to accidentally dox people?

Jerry Gamblin: All inaudible. Yeah. No, but I think it's important to not give somebody connected devices if that doesn't seem like what they use, and then to keep it in their ecosystem if it is. And I know that it's not great, but always throw that gift receipt in there and say, " Hey, if you're not comfortable doing this, feel free to take it back. It's not going to hurt my feelings," especially if it's on that creep-o-meter list. If you're going to give somebody a Ring Doorbell, you might say, " Hey, I think this is really useful. It'll help you track your packages and stuff," but there are a few downsides, like this data can go to the police. It's from Amazon. Just let them make the choice. And then I think one of the keys is to not buy off-brand or cheap electronics if at all possible. As somebody who grew up with the majority of their toys coming from non-name-brand stuff, I know it's hard. I spent a lot of time at the Big Lots-type stores growing up. And it looks the same, it seems the same, but it's not exactly the same quality. When you get to stuff that's handling your personal data, you kind of just want to take a step back and maybe not buy that off-brand voice assistant or that off-brand connected TV if you can help it because you don't know the quality. And those companies might not be around, and it might stop working in two or three years.

Dan Mellinger: For some reason, what comes to mind when you talk about off-brand connected electronics is those digital photo frames that you can get at CVS and all that, the ones that come around every holiday season. Yeah, no. Point taken. That makes sense. The software on them alone is...

Jerry Gamblin: I mean, there's a good reason for this too. The Mirai botnet, the biggest IOT botnet ever, was built off of these. It was built off of IP cameras, DVRs, that were made in China and shipped around the world for super cheap. But their downfall was they had telnet exposed to the internet. That's how they were managed. Nobody else's had that. The Google security camera or any reputable cloud security kind of camera stuff didn't have that flaw. But it was these cheap second-tier kind of surveillance systems that had these connected to the internet, and somebody found the flaw and used it. And if you get somebody an off-brand IOT device, you could be setting them up to be part of a botnet because they're the least secure and most likely to be hacked in that way.
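
For context on Jerry's telnet point: Mirai spread by logging into devices that left TCP port 23 open with default credentials. A minimal sketch of how you might check your own home network for devices answering on telnet, assuming a typical 192.168.1.0/24 subnet (an assumption about your router's defaults); only run it against a network you own:

# Minimal sketch: look for devices answering on telnet (TCP/23) on your own LAN.
# 192.168.1.0/24 is an assumed home subnet; adjust to match your router.
# Sequential and slow on purpose; only scan networks you own.
import socket
import ipaddress

def telnet_open(host: str, timeout: float = 0.5) -> bool:
    """Return True if the host accepts a TCP connection on port 23."""
    try:
        with socket.create_connection((host, 23), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for ip in ipaddress.ip_network("192.168.1.0/24").hosts():
        if telnet_open(str(ip)):
            print(f"{ip} answers on telnet -- the management door Mirai abused")

Anything that shows up is worth checking for a firmware update or replacing, in the spirit of Jerry's advice about cheap, unmaintained devices.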

Dan Mellinger: Well, and we didn't talk about this yet, but also, a lot of these devices like some of the wearables, but especially some of the stuff like the coffee maker and all that, you can't really secure those. You have to rely on updates that are proactively pushed in most cases to the device itself. Right? So if I give it to Aunt Nancy, she's not going to think about updating her coffee maker.

Jerry Gamblin: No, no. Not at all. You'll have to do it when you go over for Thanksgiving if that ever happens. Also, really watch it when you buy electronics for kids, especially drones. Drones are where I've seen a lot of this off-brand kind of thing, because while DJI ranked creepy on that list, some of these drones that come off of Amazon cost $20, $30 or whatever, and I've seen some of them that said, " Hey, you have to download the controller app for this outside of the Google store because we couldn't get it through there, or outside of the app store," so those are definitely where to watch and definitely likely to be siphoning data.

Dan Mellinger: Download our key logger that also controls your drone.

Jerry Gamblin: Yeah, exactly.

Dan Mellinger: Yeah. Interesting. Well, awesome. We're going to try to get this going so people can shop and buy, well, maybe return stuff by the time we get this published. But thanks for joining us, Jerry. Did you have any other tips and tricks? Or we'll save the surprise of which XKCD image we'd go with for this one. But Jerry, any last words?

Jerry Gamblin: Have a happy holidays, everyone. Really try to enjoy your immediate family this year. And hopefully we're all back to craziness and Black Friday stuff this time next year.

Dan Mellinger: Hear, hear. Thanks, Jerry. Take it easy, everyone.

Jerry Gamblin: No problem. Thank you so much.
