The smart(shit)ification of TVs pisses me off.

Chahk@beehaw.org to Technology@beehaw.org – 542 points –

I absolutely hate "smart" TVs! You can't even buy a quality "dumb" panel anymore. I can't convince the rest of my family and friends that the only things those smarts bring are built-in obsolescence, ads, and privacy issues.

I make it a point to NEVER connect my new 2022 LG C2 to the Internet, as any possible improvements from firmware updates will be overshadowed by garbage like ads in the UI, removal of existing features (warning: reddit link), privacy violations, possible attack vectors, non-existent security, and constant data breaches of the manufacturers that threaten to expose every bit of personal data that they suck up. Not to mention increased sluggishness after tons of unwanted "improvements" are stuffed into it over the years, as the chipset ages and can no longer cope.

I'd much rather spend a tenth of the price of my TV on a streaming box (Roku, Shield TV, etc.) and replace those after similar things happen to them in a few years. For example, the display of my OG 32-inch Sony Google TV from 2010 ($500) still works fine, but the OS has long been abandoned by both Sony and Google, and since 2015-16 even the basic things like YouTube and Chrome apps don't work anymore. Thank goodness I can set the HDMI port as default start-up, so I don't ever need to see the TV's native UI, and a new Roku Streaming Stick ($45) does just fine on this 720p panel. Plus, I'm not locked into the Roku ecosystem. If they begin (continue?) enshitifying their products, there are tons of other options available at similar price.

Most people don't replace their TVs every couple of years. Hell, my decade old 60-inch Sharp Aquos 1080p LCD TV that I bought for $2200 back in 2011 still works fine, and I only had to replace the streamer that's been driving it twice during all this time. Sony Google TV Box -> Nvidia Shield TV 2015 -> Nvidia Shield TV 2019. I plan to keep it in my basement until it dies completely before replacing it. The Shield TV goes to the LG C2 so that I never have to see LG's craptastic UI.

Sorry, just felt the need to vent. Would be very interested in reading community's opinions on this topic.


You actually can buy quality dumb TVs, but you have to do the legwork and research what are often referred to as "commercial displays." I see them everywhere in businesses, showing ads and menus. They're sometimes a little pricier, but they're usually built a little "beefier" too, as they're expected to deal with rougher usage in, say, a restaurant.

However, the other solution is the one you've already mentioned where you never plug the Smart TV into the internet, and instead bypass the "smart" on the TV with your own streaming boxes.

I think as more people realize there is a market for dumb TVs, you'll start to see that market grow more and more until they're no longer just "commercial displays." Just gotta get enough people buying them and not buying smart TVs.

I think if enough people never gave them Internet access, the manufacturers would start adding in cellular modems to ensure they get the data flowing (that is, data on your viewing habits and sending you ads).

Having worked in this field, I can tell you how it usually operates: You want the most data for the least amount of investment. As soon as your operational costs start to eat into your already thin margins, the equation falls apart.

Complex solutions designed to capture data from that 1-3% of users who actively avoid it end up costing a lot more money than their data is actually worth. In order to make this particular solution work, you need to make enough money selling whatever tiny amount of data you get from those 1-3% of users to cover the cost of putting a cellular modem in all of your TVs plus the ongoing cost of paying various regional cellular networks to deliver that data to you. You are likely tripling or quadrupling the total cost of your data collection operation and all you have to show for it is a rounding error. And that is before we factor in the fact that these users likely aren't using the built in streaming apps, so the quality of the data you get from them is below average.

The cheaper option would be to set up an ad-hoc tv-to-tv network. You might not let your TV talk to the internet, but I bet your neighbour does, or if not, then their neighbour will.

The "Anti-Fraud Community Group" already thought of that:

https://github.com/antifraudcg/proposals/issues/17

Device mesh (Androids/Chromes) to share suspicious behavior

The proposal is to use the consensus between devices on genuine and suspect characteristics

A device should be able to query from a safe and reliable source if another device has performed (within a defined period of time) some malicious action similar to the one it is going to perform, so it could make the decision not to perform that same action, autonomously.

...just in case you wanted to install an ad blocker ("malicious software"), or something.

I mean our computers and phones already do something like this while looking for available WiFi networks, so maybe it wouldn't be that farfetched. On the other hand, I just got a flashback to Jim Carrey in Batman and.... "the box" for some bizarre reason! 😂

Since I live in a small space and game a lot, I have invested in a gorgeous 4k monitor and honestly love how all my movies look as well as games, so I have zero issues. It would be nice to someday buy a large tv that didn't constantly search, scan, and update crap I don't want or need, but I am not holding my breath they will reverse course.

It's amazing how Batman Forever predicted the then-future of television, up to and including most people trading in security/privacy for convenience.

1-3% of users might not be enough people, but what is the break-even % of people to justify adding a cheap cellular modem? 5%? 10%?

You are likely not even doubling the cost of the data collection operation. We're talking under $0.50 in additional hardware per unit, with relatively low data usage requirements. The servers to collect that data are likely already more expensive, and you can easily sell user viewing habits for way more than $1/month/user. You can use a prepaid low-usage data-only eSIM with global roaming for less than $5/year, and only renew it for the devices that don't get hooked up to a user's WiFi. If it was only needed for 5% of the users, or 1 in 20, you could still get an ROI of under a year.

With a device life of 5+ years, it's definitely much more than a rounding error. Keep in mind the profits go directly to the manufacturer, so it's a percentage of the product's cost at origin, not of MSRP... which is pretty much the reason why all manufacturers have jumped onto the data collection bandwagon in the first place.
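To put rough numbers on that break-even question, here's a quick sketch. Every figure in it is one of the guesses from this thread (modem cost, eSIM price, what the data sells for), not real industry data, so treat it as illustration only.

```python
# Back-of-the-envelope payback time for a built-in cellular modem,
# using the guessed figures from this thread (not real industry numbers).

modem_cost = 0.50            # extra hardware in EVERY unit, USD
esim_cost_per_year = 5.00    # prepaid data-only eSIM, renewed only for offline units, USD/year
data_value_per_year = 12.00  # assumed value of one user's viewing data, USD/year

def payback_years(offline_share: float) -> float:
    """Years to recoup the modem cost, given the share of TVs never put on WiFi."""
    # The modem goes into every TV, so each offline unit has to carry the
    # hardware cost of the (1 / offline_share) units it was soldered into.
    hardware_per_offline_unit = modem_cost / offline_share
    yearly_net_per_offline_unit = data_value_per_year - esim_cost_per_year
    return hardware_per_offline_unit / yearly_net_per_offline_unit

for share in (0.01, 0.03, 0.05, 0.10):
    print(f"{share:>4.0%} offline -> payback in {payback_years(share):.1f} years")
```

Counted this way (modem in every unit), payback only dips under a year once roughly 7% of TVs stay offline; count the modem only against the offline units and it's well under a year even at 1%. The exact answer matters less than how hard it swings with the guesses.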

I feel like the market is only going to grow in the top end. Audio/videophiles sort of areas with large, high quality, top end feature sets.

The low end tends to be partly subsidized by the "smart" features. Think TVs that show ads in the menu, or Amazon or Google screens that want you to use their services because it's "easy" and they're "right there" so maybe people will subscribe. Couple that with the "feature" that it's already built in so it saves you an extra box/purchase for people who want cheap TVs, and I don't see it going away anytime soon.

Exactly this.

Manufacturers are NOT INTERESTED in selling low-cost dumb TVs when they can sell smart TVs and get long-term returns. They are even willing to sell the TVs at cost because they will monetise later with ads and selling your data.

Manufacturers don't want you to have a dumb TV, they want everyone to go smart - which is part of why business-targeted dumb panels are priced higher - to disincentivise regular end-customers from buying.

oh...is that why all these nice smart TVs are so affordable these days?! damn!!

Normal manufacturing efficiencies and cost reductions are surely the biggest reason they are cheaper now, but it's absolutely a factor.

So many companies in so many industries are trying to move from being product companies (make money selling a thing) to being service companies (make money from subscriptions, user data and other monetisation) and I'm doing my damnedest to keep away from any of it.

It could get interesting with right to repair - that probably includes the right to load custom firmware...

There's no downside to selling a smart TV to someone who doesn't want one/doesn't use the features.

The features we "want" from modern TVs, like Dolby Vision and all the shit they do to the image to make it stand out in the store, require a significant amount of processing power.

It's simply better business to sell smart TVs to everyone than to make dumb TVs that compete for a tiny fraction of the market when people buy smart TVs in every price segment.

The paradox being that if there were "premium" smart TVs for people like us - with proper support, privacy, customization options and no crap like ads - we'd probably buy them, and pay a premium for them.

But that's just too much work for them and they probably don't even realize that kind of market exists.

I think they know it, I just don't think they care. It's a niche market. On top of that, they'd have to convince the people in that market to trust them.

If they can get a 10-20% return on 10,000 Smart TVs, why waste the effort on properly developing and supporting 3 PrivaTVs (patent pending, exclusions apply, see your local drunk for details)?

I could be wrong, I just don't think the market is large enough that they'd be willing to throw manpower at it.

I think people underestimate the value of their tracking data. For a manufacturer, the benefits over the lifetime of the device can be way higher than 20% of the manufacturing costs.

They could still develop and support those 3 PrivaTVs, but the MSRP would easily be a few times higher than that of an equivalent Smart TV.

How many more times? If a regular TV costs $100 then they're making $20k plus marketing data. The PrivaTV would need nearly a $7000 markup for the same return.

Obviously these are made up numbers for illustration. I think that for big manufacturers it's not worth it for the return and amount of effort they would need to spend. Maybe a small manufacturer could do it. Maybe that would spur the big guys to buy them out and take it over once the hard work is done.

A TV manufacturer doesn't need to develop that PrivaTV from scratch; they can take their SmartTV and just rip out the Smart part, for a much lower markup.

A big manufacturer is the one who'd have it easier; they'd just need to make "privacy" into a selling point, then slap a "Private" sticker on instead of the "Smart" one.

Hopefully with the "right to repair", we might see some people ripping the smarts out of a SmartTV, possibly just by flashing an updated firmware, and that might convince manufacturers to give it a go too.

I was specifically talking about what the original commenter said.

with proper support, privacy, customization options and no crap like ads

Dumb TVs are already a thing as mentioned elsewhere. Commercial Displays cost more but you can beat someone to death with them and they'll still work.

I'm with you on hoping for more options. I'd hate for my next TV purchase (hopefully years from now) to be forced online under the guise of firmware updates to steal my viewing habits.

I think you're right mainly in that what they're doing now is sure and easy money. Why risk it, right?

They aren't very good though. They are durable, but usually expensive and missing a lot of features you might actually want for that price tag. For example, I've yet to find any OLED "commercial displays" that support Dolby Vision, VRR, and eARC.

It's way cheaper and easier to just buy the TV you want and not connect it to your wifi.

Computer monitors should work too, and are more readily available. Just dig through the business oriented monitors and ignore the gaming ones, as cable providers aren't really going to have anything that can take advantage of >60 fps display rates.

My personal experience with computer monitors is that they work great except they always seem to cheap out on speakers if they have built in speakers. Tiny, tinny things whose volume is always way too low.

I don't mind having separate speakers, but once in a while it would be nice to not need them.

I don't think I've ever heard what my TV speakers even sound like. I've never used them.

Same, I don't think I've ever used them; when I bought my latest TV I already had my good old 5.1 system.

Even on a high end TV the speakers are going to be bad. They're just there to check a box. TVs are so thin that you cannot physically fit in speakers large enough to sound good.

A cheap sound bar will make a huge improvement to audio quality over any built in speaker system.

Right, but at that point, may as well just invest in a fucking PC monitor. Like what else is a TV really bringing to the game that a monitor can't?

Like, if they can't put in speakers worth a damn, what's the point of even including them?

Like what else is a TV really bringing to the game that a monitor can't?

A tuner and a remote control.

Find me a reasonably priced 70" monitor and I will hail you as the next coming of Christ. That is the holy grail for me.

Since I'm going to be skipping the TV part with my HTPC anyway, why not simply use a computer monitor? Nowadays you can also get a 40"+ monitor, and that should be big enough for most people. These things might not even have any speakers, so you might need to plug it into an audio system to make it all work.

The other option is to buy the smart TV, turn off the networking, and hook it up to a Shield, Apple TV, or Roku. All those box makers are going to support the devices longer than TV manufacturers, and the streaming apps can't ignore them.

so is using something like an Apple TV or Roku box actually more secure than just using the apps directly on the TV?

Yes, because streaming boxes can be upgraded independently of the TV and so you can always have hardware that's actively supported. My old Roku 3 was still getting updates as of a few years ago, while my "smart" TV from 2015 stopped getting security updates long ago.

Last time I looked for commercial dumb TV, a SHARP was like $4000 for a 65" 1080p or something :-/

However, the other solution is the one you've already mentioned where you never plug the Smart TV into the internet, and instead bypass the "smart" on the TV with your own streaming boxes.

I did this for a long time on my old Vizio TV, but the experience was notably worse with external devices compared to built-in, due to the limited framerate support over HDMI. This led to awkward juddering when e.g. trying to play 23.976fps movies with only 30hz or 60hz output. It also meant built-in video features like motion interpolation did not work effectively.

I guess this is less of an issue today with VRR support on high-end TVs, but still, a lot of devices you might connect to a TV don't support VRR.

Your streaming box was either not configured properly, or was very low cost.

The most likely solution is that you need to turn on a feature on your streaming box that sets the output refresh rate to match that of the content you are playing. On Apple TVs it is called "match frame rate". I know Rokus and Android TV devices have similar options.

Newer TVs can detect when 24 fps content is being delivered in a 60 Hz signal and render it to the panel correctly, but this usually doesn't work if you have the selected input set to any low-latency modes ("Game", "PC", etc.).
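If you're wondering why 24 fps in a fixed 60 Hz signal judders at all, here's a tiny, purely illustrative sketch of the cadence that results (the classic 3:2 pulldown):

```python
# 60 / 24 = 2.5, so frames can't all be held for the same number of refreshes.
# The display ends up alternating 2-refresh and 3-refresh holds ("3:2 pulldown"),
# which is exactly the uneven motion people perceive as judder.

source_fps = 24
panel_hz = 60

holds = []
carry = 0.0
for _ in range(8):
    carry += panel_hz / source_fps   # ideally 2.5 refreshes per frame
    shown = round(carry)             # but only whole refreshes are possible
    carry -= shown
    holds.append(shown)

print(holds)  # [2, 3, 2, 3, 2, 3, 2, 3] -> uneven frame durations
```

A "match frame rate" setting sidesteps this by switching the output to 24 Hz (or an exact multiple of it), so every frame is held for the same amount of time.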

Good to hear newer devices support this.

My experience was from quite a few years ago (2015ish). At that time, there was no such feature in any of the devices I tried connecting, including a few brands of Android phones, Fire TV sticks, and MacBooks. I remember reading into the documentation on other devices at the time to find something better, with no luck. That said, documentation was pretty poor all around so who knows? The most useful info I found was in threads on VideoHelp or AVS forums where other users reported similar issues on various devices. Android TV was still very new and very shitty back then.

At this point I would simply not buy anything that doesn't support VRR.

This is one of the downsides of the widespread adoption of HDMI; it has quite a few. Something like DisplayPort would be better, but it's far less common. Such is life.

How is this a downside of HDMI?

It sounds to me like the user's TV or streaming box are configured incorrectly. DisplayPort doesn't magically remove judder from 24fps content being rendered to a 60hz signal.

DisplayPort never saw widespread adoption in the home theater space because it never tried to. The standard is missing a ton of features that are critical to complex home theater setups but largely useless in a computer/monitor setup. They aren't competing standards, they are built for different applications and their featuresets reflect that.

Newer revisions of HDMI are perfectly good, I think. I was surprised and dismayed by how slow adoption was. I saw so many devices with only HDMI 1.4 support for years after HDMI 2.0 and 2.1 were in production (probably still to this day, even). It's the biggest problem I have with my current display, which I bought in 2019.

GP's problem probably isn't even bandwidth; rather, they need to enable their TV's de-judder feature or configure their streaming box to set the refresh rate to match that of the content being played.

VRR support came with HDMI 2.1.

You could still have your player device set to a static 24 or 30 without VRR, in theory, but none of the devices I tried (granted, this was ~8 years ago) supported that anyway.

VRR is really meant for video games.

You could still have your player device set to a static 24 or 30 without VRR, in theory, but none of the devices I tried (granted, this was ~8 years ago) supported that anyway.

That's interesting. Pretty much every Blu-Ray player should support this. I can confirm from firsthand experience that Apple TV, Roku, and Android TV devices also all support this. I can't speak for Amazon's fire stick thingy though.

The feature you are looking for is not to manually set the refresh rate, but instead for the device to set it automatically based on the framerate of the content being displayed. On Apple TV it's called "match frame rate".

This is good to know, thank you for the info. I am getting worried about my increasingly old TV (15+ years) and I do not want a smart TV to replace it.

Is it just me or is it really fuckin' easy to not connect your TV to the internet?

I've hated "Smart TVs" for a decade now, but I solved my problem by just buying a set top streaming box (Apple TV, Nvidia Shield, etc) and leaving my TV off my WiFi.

Some smart TVs will whine incessantly about not having internet.

Thankfully mine (Philips) only bitched about it for about a week, and gave up. Now the only real complaint I have with it is that it takes forever to boot, considering it has to fire up android after it's been off.

LG doesn't do this. They also have the good sense to allow firmware updates via USB. Which is great, because turning on WiFi long enough to install an update fills the home screen with junk.

I have an LG that is a couple years old. Never connected it to the Internet and don't intend to but have wondered about firmware updates for it. I am afraid of an update adding ads or something else I don't want. What is your experience? Or is there a resource that details everything (and I mean everything) that the updates change?

I've installed two firmware updates on my C1 and they have never added advertisements. I installed them because they both fixed specific bugs I was experiencing with my home theater system.

I don't see why they would try to shove ads in an offline firmware update when it is both easier and more useful to download them from the internet once the device is connected. It's hard to make money from ads when you can't actually track user engagement.

That said I would only bother updating your TV's firmware if there is a bug fix or feature you need from a newer version.

I also have a C1 and have been annoyed that it won't turn on my connected AVR when I turn on the TV even though it has the capability and it turns it off when I turn the TV off. This wouldn't happen to be one of the bugs you upgraded to fix, would it? What bugs did you encounter that you fixed with firmware upgrades?

Not OP, but what you're referring to is called CEC control. Maybe just have a dig through your settings on both devices to check CEC is enabled.

I am aware of the CEC settings and they are working - the TV will power off the device just fine using CEC and it has the ability to power it on (I can manually trigger this) but the TV does not send a power on command to the AVR automatically when the TV is turned on. This seems to be a known issue but I don't have a link to the forum discussion I found a while ago where others have the same problem.

The bugs I was having were related to eARC not working properly when G-Sync was enabled on my PC. I haven't had any problems with my C1 not responding correctly to "one touch play" CEC signals from my PS5 or Apple TV.

Well I'm glad you got them fixed!

"One Touch Play"/power control/CEC settings can sometimes be buried in weird places. If one device in your chain doesn't have it enabled, it can cause bad behavior. There are two different settings I had to make sure were enabled on my Denon AVR before devices plugged into it would turn on the TV. So I would double check in those usual places.

The only CEC related problem I've had with my LG has been that turning everything on with my Apple TV or Nintendo Switch will cause the PS5 to power on as well if it was the last device in use before power-off. I suspect this is actually a Sony issue, as I had it with my PS4 as well, and no other devices respond incorrectly to the one touch play signal like this.

I've heard about this power-on issue but it's not the one I've been experiencing - the only device is the AVR and it can turn on just fine if I pull up the menu and hit Power through that.

A good place to check is avforums or avsforums. There have been a lot of CEC and ARC issues (on all brands!). And a lot of people discussing the different updates (while laughing at the useful release notes)

I personally found that CEC power on only worked when I had the amp input selected, or used ARC.

Is it just me or is it really fuckin' easy to not connect your TV to the internet?

I prefer not to reward corporations by buying equipment with built-in spyware.

(Also, "easy to not connect" depends on whether the TV nags you, or disables features, or uses any open wi-fi it finds, or includes a cellular or mesh modem.)

You're just giving the same companies even more money when you buy their much more expensive "dumb" digital signage products.

Nobody's been able to show me a TV that actually does those other things you suggest. If one did, I wouldn't buy it, but I won't base my current purchasing decisions on hypothetical future products.

You're just giving the same companies even more money when you buy their much more expensive "dumb" digital signage products.

No, I am not.

(And even if I was, it wouldn't boost the sales numbers of spyware products, encouraging more of the same.)

Nobody's been able to show me

If you don't want to believe it's a problem, I don't expect anyone wants to waste their time trying to change your mind.

(Jay did report seeing examples in the wild, though.)

No, I am not.

Who do you think makes these digital signage products? They all come from LG, Samsung, Hisense, etc.

If you don't want to believe it's a problem, I don't expect anyone wants to waste their time trying to change your mind.

Show me a TV that ships with a cellular modem or that connects to open wifi networks without being prompted, and I won't buy it. I'm not the one with the burden of proof here. It's very easy to see if a TV does any of this shit before you buy it just by checking reputable review sites like rtings. So telling people any TV they buy at Costco does this is just spreading FUD.

So telling people any TV they buy at Costco does this is just spreading FUD.

Nobody has said that.

This honestly and embarrassingly didn't occur to me.

I got a roku for my smart TV because I wanted something with a Jellyfin app. I don't trust roku any more or less than Vizio, but I find I like the idea of removing internet access to the TV directly.

Yeah, they can try and push as much BS as they want, but in the end I'm never using their software and always using a Shield.

Isn't the Shield the one that introduced ads on the home screen a few years ago?

Yes, but one benefit of the Shield over other Android-based devices is that it's pretty hackable. You can replace the launcher and get rid of the ads, and while it's more than a single click to do that, it's easy enough by following a guide.

On my older shield that had the older launcher without ads I disabled updates for the launcher and reset it to its original version all through the normal settings interface.

I tried doing the same with a Fire TV without success; Amazon forces you to use their launcher unless you're willing to put in substantial effort.

Back in 2019 I wanted a nice LED screen with high resistance to screen burn but the only economic option was a Samsung Smart TV.

I actually ended up getting it, ordering a custom mount for the ARM chip, and using an input method on the chip that makes it run Java natively, so that I could make the Smart TV drop its firmware onto a USB stick. From there I could modify it, since it was just running a version of Linux.

So that's the story of how I un-smarted my TV. Get fucked, Samsung.

What a fucking ridiculous workaround that's completely unavailable to the regular consumer...fuck Samsung (and the industry in general) for this approach.

Well, now that the community has had a few years to reverse engineer it I assume there are a lot of better and easier ways to replace the firmware with an open source quick fix. So, it's not like my way is the only way. It was just necessary at the time. In fact, the community worked very fast to find a way to hack these "System on a Chip" architectures since the ARM chips were first released. They're used in Macs, phones, TVs etc and have a very high power efficiency, but it is a very clear design choice to make them extremely difficult for the user to access and customize.

Unless it's "load this file on a USB stick and plug it into your TV" easy, it's still out of reach for most consumers.

But my point is, it shouldn't be necessary to do these things in the first place. Fucking drop the "smart" element from them completely, they always suck ass anyway and are laggy as hell to navigate.

all cars are headed there too.. so very soon you won't be able to get news or transportation without someone else's permission..

Please drink a confirmation mountain dew to activate your air conditioner. Sorry, we didn't hear the sound of the can opening, please open another where the microphone can hear it. Sorry, our servers are down, we apologize for any inconvenience this may be causing. Sorry, getting your family out of the vehicle is an optional feature which you have not subscribed to, please say "Do the Dew" to get temporary access to the friends and family doors. Sorry, I didn't get that, please say "Do the Dew". Sorry, I still didn't get that, please call customer service at 1800-luv-cars to give your ad consumer confirmation phrase to a customer service representative.

I mean you have technically always needed a drivers license to drive which I would classify as "needing someone else's permission."

The bus on the other hand?

I mean you have technically always needed a drivers license to drive

No you don't. They still work even if you don't have one.

Hell, the keys turn in the ignition even if you're drunk.

it sounds like you've already surrendered your right to transportation.. that license is something that i hold myself to and others.. that's the way citizenship works.. i don't ask for permission.. i agree to abide by minimum functional requirements because i expect others to as well, and carry proof.. you are confused..

if you think getting your driver's license is "asking the government for permission" you are already a slave

Smart devices are basically data sniffers, scooping up any info about you and your family and your habits. They watch network traffic, listen to your conversations, and record video. I'll stick to dumb devices, thanks.


Sony Bravia models now give you an option to make it a dumb TV as part of the out of the box experience. It's the first question they ask you when you power it on.

Wait, really? Is this also the case for older models? I have one but it's already a few years old. 🤔

My X950H does not give the option (although there is a hidden dev/"pro mode" that allows you to turn it into a dumb screen I think) but my newer A80J model does give the OOTB option to disable the smart features.

The Bravias with Google TV are at the absolute limit of what I will tolerate from a smart TV. Suggestions/tailored stuff on the Home Screen, but no invasive ads. Anything further and I'd turn them into dumb TVs and use an Apple TV or Google TV dongle instead.

I got a display signage TV. Totally dumb. The only app it has is YouTube and that's optional. I don't even have the internet hooked up to it. Works fine for gaming and occasionally streaming via other devices.

Where do you even get something like that?

I got mine through Amazon. Samsung makes the cheapest ones I've found. Just search for something like "samsung commercial TV". They're generally a little more expensive than your ad/data harvesting-supported TVs but if you value your privacy and longevity of your devices, it's worth it.

Also, these industrial monitors have better heat sinking from the LED backlight, which increases power efficiency and service life ā€“ the two metrics their intended buyers care most about.


I have solved this by not buying a TV in the last two decades. I just own projectors. Larger screen, cheaper, no "smart" nonsense. Depending on mounting, essentially invisible when not in use and not a large black rectangle in your living room. Do recommend.

How dark do rooms need to be for them to work? Are there issues with shared spaces where someone might want a well lit workspace?

Having the sun shine through a large window is an issue, but that's also an issue for getting a good picture on normal TVs. Picture quality with projectors is better when the room is darker (it increases contrast), but a normally lit room is just fine. It also depends on how and what you're watching. I generally do darken the room when I'm actively watching a movie, but there's no need for that when putting something on you're just half watching. You can still tell just fine what's going on even in a bright room, it just looks a bit washed out.

It also depends on the brightness/class of the projector of course, and on the screen. Don't underestimate the visual difference a screen makes. Both having any screen over just projecting onto a white wall, and a great screen over a cheap random one.

The core issue is that a projector makes bright by throwing light and dark by not throwing light. If your surface (screen or wall) is fairly white and already illuminated without the projector projecting anything onto it, that is as dark as a black part of the picture can possibly be. There are screens that are reflective but more gray than white; those help with that, too.

I would say a normally lit room (with artificial light in the evening, for example) is fine for using a projector. "Well lit workspace" really depends on your definition. For my definition of "well lit" it wouldn't be ideal, but I've just installed like 49000 lumens of illumination into my 3.5 x 3.5 meter workshop, cause I like to see what I'm doing and life is too short for bad lighting.

Thanks, that's a lot to think about. We currently use an oled computer monitor as a TV (hooked up to a pi) and it's beautiful but there are limits on screen size and it's crazy expensive (you're paying for stupid fast refresh rates and the Gamer(TM) markup)

our house is very bright during the day, lots of glass in sunny Australia, so it's probably not a great candidate for a projector generally but it does have me thinking about one in the bedroom for late night movies. Probably a lot cheaper and neater than another absurd monitor.

My last projector came with a low power android TV stick. I thought that was pretty cool, even knowing that I'd never use it. That means the smart TV features are there for the people that want them and can literally be thrown in the garbage for those that don't.

Yep, they're horrible. I always disable internet on them, uninstall any apps I can, and generally do what I can to avoid using the built-in smart TV stuff, but I shouldn't have to do this; it's unfortunate and sucks to deal with. They just take advantage of consumers who don't know better. I wish the TV market wasn't like this. :/

I'll probably need to buy a new TV in a year or two. I read there are some ways to flash custom firmware on it.

Just get a monitor. The only real difference between a monitor and a TV these days is the lack of speakers and the "smart" stuff. But TV speakers suck anyways, so you'd be better off using a soundbar regardless.

I haven't seen >40" monitors at a reasonable price though compared to TVs

Edit: also, there's usually not any audio output to an amplifier on monitors.

As I mentioned earlier, use a soundbar or dedicated speakers (most TV speakers suck anyways). Also, for reasonably priced monitors, look for ones marketed as "commercial displays" - they're generally the same price or even cheaper than a similarly spec'd TV.

As I mentioned earlier, use a soundbar or dedicated speakers (most TV speakers suck anyways).

Yes, but there is no audio output (as in a RCA, Optical etc., not built-in speaker) to get the audio from the monitor to the amplifier.

Depends on the monitor; mine has a 3.5mm jack to get analogue audio out of the HDMI input, which I use to get audio from my Xbox to the rest of my setup.

None of the >40" monitors I've looked at today had any audio outputs. But finding one that isn't an ultrawide format for gaming is probably the bigger issue, it seems.

??? The output is provided by whatever box you're connecting to the monitor - set-top box, Android TV, Apple TV etc.

Not true at all... e.g. a Chromecast doesn't have a dedicated audio output and neither does the Apple TV; they only have HDMI output. Now, HDMI does also carry audio, but many amps, and especially soundbars, do not have HDMI passthrough and rely on getting the audio signal from the TV/monitor if you're using those devices.

There are plenty of HDMI switches or splitters out there that support audio extraction; just use one of them to sit between your source and your monitor. Like this one: https://www.amazon.ca/gp/product/B00XJITK7E

You're completely missing the point...this will add an ugly extra box cluttering up an otherwise clean setup. Of course there is a workaround, there always is, but it's far from optimal. It's a bad solution to an annoying problem that shouldn't exist.


Can you get monitors in 50 inch these days though? Then this would be my route as well once my current dumb TV dies

Yes, Viewsonic for instance is one company that makes them. Although, they're typically advertised as a "commercial LED display" or something like that. Basically look for "display" instead of "TV".


Just wanted to say same. I have used a Linux box as my Media Center and Home Server since 2008. Also have a Chromecast dongle so I can stream from Android and Android apps. Not sure what else one needs.

Seems to me what one really wants is mostly a browser and the ability to stream stuff from apps on your phone. Since the Linux box is a Media Center and Server it also has a lot of features a Smart TV would not have. Just do not see the value of a Smart TV.

Pro tip: buy a computer monitor, e.g. 4K 34 inch.

They don't have any smart TV shit, but you need to buy something extra for the audio.

that's a lot more money for a smaller screen, though. 32" is a big monitor sitting in front of a desk, but a small TV if you're on a couch.

you can buy "gaming" monitors that are 48" big for example https://x-kom.de/653504-gigabyte-aorus-fo48u-475-zoll-4k-gaming-monitor-hdmi-21dpusb-c-120hz-hdr-oled (caution! german website)

or this one: https://www.lg.com/de/monitore/lg-48gq900-b

Idk at which point they are starting to implement some kind of "smart tv"

caution! german website

Is there something scary about German sites that I'm not aware of?

jesus what are those prices

That second one is apparently sub-ms latency, which is incredibly unnecessary for a TV.

If you mean the 0.1ms, that's not latency. That's grey to grey speed which is practically meaningless. The image can't get from your PC to the monitor in 0.1ms.

The fastest monitors I can find have a real latency of 1.7ms, and that's a 1080p monitor running at 360Hz. When you get down into the 4k 120Hz range that the top TVs do, the speed falls to around 5ms, which is about the best you'll get from a TV.
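For context, the refresh period alone already sets a floor on how quickly a new frame can reach your eyes, regardless of the advertised response time; a quick bit of arithmetic:

```python
# A new frame can only arrive once per refresh, so the refresh period is a hard
# floor on responsiveness no matter what the grey-to-grey spec says.

for hz in (60, 120, 144, 240, 360):
    period_ms = 1000 / hz
    print(f"{hz:>3} Hz -> a new frame every {period_ms:.2f} ms")

# 360 Hz -> 2.78 ms per frame, 120 Hz -> 8.33 ms per frame. A quoted 0.1 ms
# "response time" says nothing about end-to-end input lag, which also includes
# the display's own processing.
```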


48" is still small by many people's standards. My TV isn't particularly large and I think it's a 52"


Many features that come with TVs these days are not available on monitors. For example "filmmaker mode"

Monitors are effectively always in 'filmmaker mode', as they don't do frame interpolation and colour grading and over-scanning and all the stuff that filmmaker mode disables.

I donā€™t think there are many monitors that support 24fps natively.

A 120hz or 144hz (reasonably common) monitor could do 24FPS with no issues, as that's either every 5th or every 6th frame exactly.
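If you want to sanity-check other refresh rates, it's just divisibility; a quick sketch, nothing monitor-specific:

```python
# A refresh rate can show 24 fps film with perfectly even frame durations
# only if it's an exact integer multiple of 24.

for hz in (60, 75, 120, 144, 165, 240):
    multiple = hz / 24
    cadence = "even cadence" if multiple.is_integer() else "needs pulldown"
    print(f"{hz:>3} Hz: {multiple:>5.2f} refreshes per frame -> {cadence}")
```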

I'm not sold on the idea. I did find a couple of OLED 4K 120Hz monitors in "TV sizes" (BenQ MOBIUZ EX480UZ), but they don't seem to have (or at least advertise) HDMI ARC or HDMI-CEC support, the brightness is only 480 nits (vs 800 on an LG C2), and they seem more expensive.

I would not recommend it as a replacement for an actual TV in a living room.


Actually I think a lot of people do upgrade their TVs somewhat often, as stupid as that is. I will be clinging to my dumb TV for dear life as long as possible, but I feel like people are very consumerist these days. TVs have gotten cheap enough that they feel able to.

It goes without saying that no one should have to buy a new TV because there is a bunch of trashy software on it, but I'm sure it's already happened enough to incentivise these assholes' bad behavior

The smartshitification of TVs is annoying, and I too hate being tracked by every device I use. That said, the incredible value of these TVs can't be overstated. Most people can't or won't spend more than $500 on a TV, so most people would still be using 1080p displays if it wasn't for this phenomenon, but now EVERYONE gets to have a 4k TV because the price is partially subsidized by all those ads you're seeing.

I think it's probably a net negative for society overall, but just wanted to point out that there is an upside to all those ads.

I found this comment funny because I'm still using 1080p TVs in the house. Mostly because after we got them I cut the cord and couldn't even remotely stay under the Comcast cap with 4k. Even after I moved on to T-Mobile home internet, uncapped for now, I haven't upgraded because there really hasn't been a pressing need. Hell, cable and streaming don't actually deliver 1080p, it usually is 720p max. I have no idea if the 4k options actually broadcast in 4k vs a smaller resolution.

honestly you're playing it smart. I can't speak to cable, but you are right that streaming apps don't exactly justify the bump in resolution. The image they spit out to your screen is 4K, but the compression required to stream in 4K means that most of the added details are just crushed by the format. So ironically, you don't really benefit from 4K unless you use Blurays, at which point you don't really need any of the "smart" features of your 4K tv.

The compression issues are true for 1080p too, any dark scene on Netflix gets some horrible color banding and artifacts.

Ironically, the pirates don't have that issue, as their multi-gig torrents carry far less compression than the comparatively tiny stream Netflix provides.
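To put ballpark numbers on that bitrate gap (the bitrates below are rough, commonly cited figures, not measurements of any particular service):

```python
# Rough file sizes for a 2-hour movie at different video bitrates.
# The bitrates are ballpark figures for illustration, not measured values.

runtime_s = 2 * 60 * 60

for label, mbps in (("720p stream", 5), ("1080p stream", 8),
                    ("4K stream", 16), ("UHD Blu-ray remux", 60)):
    size_gb = mbps * 1_000_000 * runtime_s / 8 / 1_000_000_000
    print(f"{label:<18} ~{mbps:>3} Mbps -> ~{size_gb:.0f} GB for 2 hours")
```

Same resolution, wildly different amounts of data to describe it, which is where the crushed shadows and banding come from.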


Samsung commercial displays don't have any smart features, I think.

They don't come with a stand.

https://www.samsung.com/us/business/displays/4k-uhd/qe-series/qet-series-43-lh43qetelgcxza/

I'd much rather have a VESA mount than a stand. VESA stands are easy to find and cheap, and if you want to wall mount or get a fancier stand that is an option.

The specs on those QET models don't look great. 300 nits max brightness? That can't be right. Aren't they designed to be in brightly lit areas like inside stores?

I just never entered my WiFi details into my smart TV. I only use the HDMI inputs on it anyway, so it behaves like a dumb one. It's an RCA TV from Walmart, if anyone is wondering.

Like a lot of people here, I did the same: bought the smart TV, which needed internet for a firmware upgrade, and once it had started up and wasn't asking for my input or whatever, I selected HDMI1 as the startup input and plugged in a Chromecast. Then I went into the TV menu and made it forget the network settings. It's just a monitor used to cast Netflix, Disney, Plex, Prime, etc.

How is plugging in a Chromecast any different than using the same software built into a Google TV?

As the owner of the original $500 Sony NSX-32GT1 TV from 2010, I can tell you exactly how different it is. Its UI wasn't exactly a snappy experience to begin with, but it's gotten even more sluggish over the years until everything stopped working due to being EOL. The OS on it has been unsupported since around 2016, so it's stuck on an ancient Android TV version. Most apps (even built-in ones like YouTube) stopped working a few years later, and cannot be updated.

Sure, a $50 Chromecast will eventually suffer from the same problems, but I can replace it 10 times for the same amount of money while keeping my TV because its 32-inch 720p panel still displays content just fine.

While true, the context of this discussion was mostly about privacy rather than functionality. A built-in Chromecast vs. an external one changes nothing on that front. There is also nothing preventing you from plugging that fancy new external Chromecast into the old Sony and getting new functionality from it, if the display is still to your liking.

I don't know about Google TVs, but other brands have microphones and ads in their built-in ones...

I have some practical annoyances, mostly surrounding multiple remotes and the clunkiness of it all. I have two TVs in my house: a Samsung smart TV from 2019 and a Hisense Google TV I purchased earlier this year. The Google interface is not the most responsive, but it packs in all of the stuff I would want, and Android is the most supported platform for apps. Samsung's OS has good app support, but open source projects and more niche apps aren't there; I think there is a Nebula app now, but for a while there wasn't, for instance.

So, I bought one of the Chromecast with Google TV sticks to bridge the gap. It works well most of the time, but unlike the Hisense, it doesn't support AirPlay. So if someone AirPlays, you get kicked back to the native OS and have to use the native remote. It's possible to configure the Chromecast to use the native remote, except the home button doesn't map; it opens the home menu of the native OS. So it's just kinda clunky. I do think newer Google TVs with AirPlay built in (varies by brand) are going in the right direction here. If you're concerned about privacy, they're still gonna be a nightmare though.

multiple remotes

I have a Logitech Harmony Ultimate remote with Hub. It was the best thing ever for me, because in my living room setup I have the TV, A/V receiver, Nvidia Shield TV for streaming, and an Xbox Series X for my kids and occasional BR/DVD movies. With programmable activity buttons, a single tap on the remote turns on the appropriate devices, switches them to the correct inputs, and provides appropriate controls. Tap another activity, and it turns on devices that are required, turns off the ones not in use, and switches controls to the proper device.

Unfortunately Logitech discontinued the Harmony line. It's only a matter of time before they take down the servers that host device databases and allow you to reprogram the remotes. I've been looking into replacements, but there aren't many that have feature parity with Harmony.

Seems the best way is with IR blasters and learning remotes. There is software for PCs/Macs that will let you capture the IR remote functions, then you can assign that to a "smart/learning" remote to relay those signals. Not nearly as easy as the Logitech setup, and I'm dreading the day the Harmony remotes stop working.

I'm really happy that I got to skip the whole smart TV stuff by being a projector guy. They are still dumb as rocks for the most part.

My HTPC is an Odroid, so all is well in that department as well.

Thanks for this info. One day I will buy a projector. They have seemed to be superior and cheaper overall than most TVs now. This is just another tick mark in their favor.

How well does this work? Like is glare / not being able to see an issue? How much of the wall color shows through, or do you have to have a special screen?

How well does this work?

Essentially you trade some contrast and color for a much bigger screen size. If you can live with that they are incredible.

Like is glare / not being able to see an issue?

Generally, you want to use a projector in a light controlled room. If you cannot block out most sunlight/ambient light, it might not be feasible in that room.

As you approach more expensive projectors (like $3,000-$5,000+) you can get a sufficient picture in a bright room, but the contrast will be severely lacking - good enough to watch some soccer but not something you want to watch your new 4K Blu-ray on. The darkest possible black in a projector picture will always be the remaining ambient light in your room. The darker you can get your light levels, the better.
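A rough way to see how ambient light eats contrast: what you perceive is projected white over (projected black plus whatever room light also reflects off the screen). All the luminance numbers below are invented purely for illustration:

```python
# Effective on-screen contrast once ambient light reflects off the screen.
# Every number here is made up for illustration, not a measurement.

white_nits = 100.0                         # projected white, measured off the screen
native_contrast = 2000                     # projector's native contrast ratio
black_nits = white_nits / native_contrast  # 0.05 nits of projected "black"

for room, ambient_nits in (("blacked out", 0.0), ("dim lamp", 0.5), ("normally lit", 5.0)):
    contrast = (white_nits + ambient_nits) / (black_nits + ambient_nits)
    print(f"{room:<12}: effective contrast ~{contrast:,.0f}:1")
```

Even a little stray light collapses the ratio, which is why blacking out the room (and darkening the wall around the screen) matters more than projector specs suggest.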

How much of the wall color shows through, or do you have to have a special screen?

While you can project directly onto the wall, I would recommend a projector screen; even a cheap one will do. I project onto a regular $200 white projector screen and can't complain even after years of usage. Since light produced by the projector bounces off your walls, it's recommended to black out the surroundings of your screen; even painting the wall behind the screen black/gray will immensely increase the contrast of your image.

do you have any screen and/or projector recommendations? i will be moving soon and will more than likely end up selling my smart TV. was thinking to buy another TV but reading your comment has me more inclined to go to the projector side.

In general, https://www.projectorcentral.com/ always has solid up to date recommendations for different price brackets.

Depending on where you can place it, you might want a short throw or ultra short throw projector. Be careful with the ultra short throw projectors ("Laser TV"). Many of them are starting to introduce smart TV features since they are becoming more mainstream. Try to stick with known projector companies like Optoma, BenQ, Sony, Epson and JVC.

As for the screen, don't worry too much about it. Get a white screen without any gain to avoid hot spots or dim pictures. Make sure to max out the possible size for the location, you will regret it later if you don't. If you can, get a screen with a solid aluminium/steel frame and mount it on a wall. Do not bother with cheap pull-down screens, they will start to show creases and folds within the first years.

cheers, dude! very solid response and you painted a great picture of what i need overall, i appreciate it. i haven't heard of Optoma or BenQ actually! and i'm unsure of what a short/ultra short throw is but will research that. thanks again man, will definitely get on the prowl and start making it happen!

My worst AV mistake was replacing an old CRT with a Samsung "smart" TV.

That thing lost so many features through OTA updates that could not be disabled, to the point where the boot-up screen that showed off all the features just kept getting emptier as the years went by. I believe now it just shows a TV guide icon and a Miracast icon lol... I ended up just switching off the WiFi on it.

I only buy projectors now. Still have one dumb LG TV kicking around though - my parents were going to throw it out, but I managed to fix a minor issue it had with the LCD panel.

I finally looked in the settings and found I can set my Roku TV to not start in the smart menu, and it is immediately better for me.

I use a 32" 60Hz monitor. That's the way to go. Level One Techs covered this for those same reasons: Wendell's mom's smart TV ran out of storage for the OS updates, and some apps upgraded beyond working on the old OS. It's like cell phones. So, I went with a large cheap monitor on sale and stream from my PC or a Roku stick. It's basically built-in obsolescence.

32" is pretty small for a living room TV. I'd wager most are at least 40 or more.

I'm still rocking the original Shield TV that I got in 2017. It's still like new. They added ads to the UI a while back now, but I just installed a custom launcher. Works great.

One of the best devices I've ever bought. I got it used for 100 bucks in 2018. I don't even play games on it anymore, but it's still the best device for streaming in the house.

I had the Shield TV Pro 2015 version that I upgraded with an SSD. I wasn't planning on replacing it, but it died a few months ago so I upgraded to the 2019 version.

I'm looking to buy a new TV soonish and I'm really afraid of ending up with something with a ton of pre-installed bloatware, simply because that's the industry standard nowadays. If anyone has any tips for "dumb" TVs in the ~€600/$650 price range I'd love to hear them. I have a Chromecast for streaming and it works fine, so I'm really just looking to buy a large screen without bloatware, no internet connection required, etc. That's what my current ~10 year old TV does and tbh I just want the same thing but with better picture quality.

What you want to look for is a "digital signage display" or "commercial display" like this one. These are basically just large monitors. They generally have really good display quality because they're intended to make products look good, and near-bulletproof electronics because they're intended to be on all day. Because of this and not being subsidized by bloatware they tend to cost more than smart TVs of the same size.

That's a great tip thank you!

No, that's a bad tip. You're better off buying a consumer screen (better picture quality and more options) and just never connecting it to the internet.

I prefer Roku because it's pretty trivial to block their ads and tracking, and I can add and remove channels and features pretty easily. Now if they started forcing me to have certain channels that would change. Plus you can use it as a dumb TV without connecting it to the Internet, and even set it to use an HDMI in by default so you never see the OS.

set it to use an HDMI in by default so you never see the OS

How long do you figure we have before manufacturers remove this functionality?

There's lots of places like sports bars that send TV signals from a central location and need the TVs to stay on the selected input, so I don't think we'll see it go away entirely. It may move to displays that cater specifically to that use case, however.

I commented above but this part may help:

Thinking more about it, using eBay's boolean search options may have helped drill down to good choices.

I too am in the market for a new tv. I've had my 60 inch plasma tv from Samsung for going on 12 years. I dropped a pretty penny on it from Best Buy at the time and haven't had any problems with it at all for over a decade!

Only in the last year or so I've started having issues with turning it on if it was just turned off. Like it's plasma so I've always been super careful of burn-in of images. So I press pause on whatever I'm watching on the Roku streamer connected by HDMI. Then I turn off the tv itself. If say I'm running to the bathroom, when I get back and attempt to turn on the tv it'll make the red power light blink randomly several times over and over... but it won't initiate the start up of the screen...

I've tried lots of things... unplugged the tv and waited about 10-15 seconds and plugged it back in then pressed the power on the remote and it'll usually come back on with the volume all the way back to "1" which stinks... but it still "works"...

I'm willing to pay a good amount for a plasma adjacent/similar tv just because of the luck I've had with this current tv. It makes more sense to me to invest in a really great tv now, as opposed to replacing several less-great tvs when they inevitably break down..

I've been looking around, and obviously they've stopped producing plasma tvs now... so I'm still searching for a plasma-like tv that I want...

Look at commercial TVs, those used by businesses. Some even come with a RPi slot.

I was never so glad I'd bought a normal telly as the week I spent in a rental with a "smart" one.

I don't think I'll ever need anything like a streaming box or whatever. I'm fine with the computer monitor I have right now, which is a bit wider than an early 2000's CRT. Anything I stream is done through Firefox with NoScript and Origin to block the bullshit. I'm not entirely sure why anyone uses streaming boxes and whatnot when you can just do the same thing, but safer and cheaper, on a desktop. Maybe so you can more easily watch stuff with people in the same room? Find a use for that couch? ¯\_(ツ)_/¯

Many streaming services don't offer content at the same quality when played from a PC vs something like an Nvidia Shield, and the 4k upscaler on the Shield 2019 is very good. I can understand if these features aren't things you care about, but if you have a modern 4k OLED and you want to take advantage of 4k HDR Dolby Vision content from various providers, doing that from a PC will prove difficult. Also, my partner is not very tech-savvy and I need to keep the TV usable for them, so running an HTPC is kinda out of the question for me, even if it had feature parity.

I use a very old (circa ~2010), very underpowered (6GB of RAM because one slot is dead) OptiPlex 760 with a DP to HDMI adapter to watch movies and live sports on a projector.

I can watch my pirate streamed movie or sports in one window and do other nerd stuff in another no problem.

Running Linux mint-xfce with Firefox, ublock, and containers.

Internet is acquired by tethering one of my old Android phones running dnscrypt-proxy and Tor (InviZible Pro), with KDE Connect for remote control.

Agreed. Chromecast and Roku are just as bad as the shit built into the TV. Proprietary boxes of DRM and adware the lot of them. The only thing worth streaming from is a PC or maybe a rooted Android box.

Oh, I dunno... my Chromecast for TV is holding up okay, and I don't get advertisements in its main UI.

My 2013 LG LCD is doing okay, although I broke a bunch of the dots. I will have to replace it soon, and there aren't any non-smart units available. I guess I'll have to block the traffic.

An Amazon Fire Stick is far smaller, much quieter, draws less power and is simpler to use than a general-purpose PC.

Plus, if I'm using a PC I'd probably only use Linux, so I'd have to deal with lower quality streams because DRM... so overall the experience would be worse.

Using a more 'normie' Windows box as a streaming box could work, but that doesn't solve the noise(!) and power draw issues; that feels like a compromise rather than a choice.

I've recently bought a Fire Stick and don't regret it one bit. It doesn't feel janky and doesn't have ads as far as I can tell. The provided remote includes an IR emitter that can turn the TV on/off and change the volume (why this isn't provided by HDMI itself is beyond me), and it's much faster than any smart TV, so you can watch content without having to wait.

I forget about the DRM bullshit Linux needs to deal with. As GabeN says (paraphrasing), "piracy is a service problem."

Stremio + Torrentio + a debrid service is easier and better than bouncing between different online subscription services.

Still using a 14 year old Panasonic plasma TV in the living room and I dread the day that I have to replace it.

I run a DNS server on a Raspberry Pi and block only the domains that LG pulls ads and promoted content from. That way I get a clean and responsive interface while services like Netflix still work.

I'd also recommend configuring your router so that no device other than your Pi-hole can make DNS requests; some devices bypass your DHCP settings and use hardcoded DNS servers.
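
For anyone wanting to copy this setup, here's a minimal sketch assuming a Pi-hole at 192.168.1.2, the older `pihole -b` blacklist syntax, and a Linux-based router that uses iptables; the blocked domain is a placeholder, so pull the real ones from your own Pi-hole query log while poking around the TV's menus:

```
# On the Pi-hole: block the TV's ad/promo domains
# (placeholder domain -- check your query log for what your TV actually contacts)
pihole -b ads.example-tv-vendor.com

# On the router: drop DNS traffic that doesn't come from the Pi-hole,
# so devices with hardcoded DNS servers can't sneak around it
iptables -I FORWARD -p udp --dport 53 ! -s 192.168.1.2 -j DROP
iptables -I FORWARD -p tcp --dport 53 ! -s 192.168.1.2 -j DROP
```

Some people prefer redirecting stray port-53 traffic back to the Pi-hole with a DNAT rule instead of dropping it, but dropping is the simpler starting point.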

My Samsung TV from 2013 appreciates this post.

Unfortunately I'm using a 2nd-gen Fire TV with it, and it's horrendous. I also use a Xiaomi Mi Stick, which is also horrendous (shitty hardware).

I want dumb but good panels, no smart features please. Does anything like this even exist anymore?

It does, but they're mostly computer monitors. You can use a computer monitor as a TV for streaming media; the catch is most of them don't have internal speakers or ATSC/DVB/etc. tuners for over-the-air TV.

Also, have you tried the Apple TV? It's a set-top box that isn't horrendous, and it only phones home for software updates + optional integration with Apple's cloud services (iCloud). It doesn't spy on its users.

So buying a smart TV and connecting it to a streaming box/stick for accessing streaming services does seem to be the best solution.

Most digital signage displays are 'dumb' TVs.

Yes, but they also tend to have crappy specs compared to 'good' TVs.

It's all about supply and demand, and the customers have demanded smarter TVs. In this way capitalism drives innovation. Who knows, a couple years from now TVs will likely be twice or three times as smart as they are now!

Who knows, a couple years from now TVs will likely be twice or three times as smart as they are now!

Don't scare me like that!

AI labeled TVs will slowly indoctrinate you into ideologies you would have scoffed at ten years ago. And you won't even notice it.

Not to mention tracking your every move. "Drink a Pepsi to continue watching."

I have a 720p Sony Bravia from 2008. It has a coaxial input. I can still play my retro consoles on it.

I hope it never dies. I'll do whatever I can to keep it for as long as I can. Even if it means paying the equivalent of a new TV in repairs.

My biggest regret was getting rid of a perfectly good portable CRT TV that would have been ideal for pre-7th generation gaming, just as they stopped making good quality CRTs.

I have an LG C1, and the apps and start page don't bother me too much. The remote gives you easy quick-launch options, and I just immediately boot into my PC anyway and then use the remote's quick-launch features to hop into the streaming apps I used to use (because DRM makes browser streaming a hassle, especially on Linux).

My beef with my smart TV is that basic settings are now hidden behind submenus under submenus under submenus. Want to blank my screen? Adjust some picture settings a smidge? Audio stuff? Best to Google it, because stuff that used to be front and center on my old TVs is either missing or buried deep down.

I don't think they charge extra for the smartness either. A 55-inch 4K TCL TV can be had for under $500, and those have Roku cooked in. My C1 was all about the hardware as well.

Same here. I use my TV as a glorified monitor with a ton of HDMI ports. All smart features are basically non-existent to me. I disable all the picture "quality improvement" shit (which typically introduces latency). Everything else is then handled by the attached smart devices that I can exchange or upgrade however I want.

When I look for a new TV, I actually still prefer going to the store, because the one most important aspect for me is input latency. I absolutely hate hitting a button on the remote and then having a delay of a second or so until it actually reacts to it. So this is something I need to try in person: if I hit a button, how fast do I get feedback? If it's not instant, the TV is out of the question.

I'm using my TV as a monitor; it starts on HDMI1, my Chromecast. I never use the TV's remote, and it's plugged into my good old 5.1 setup for sound. What are you using the TV remote for?

For picture options and to switch between inputs.

I get the concern with bloatware, but it's never been an issue for me with a TV. I simply use Roku and it's functioned extremely well. I also don't connect my TV to the internet for the most part. It's literally as simple as that. It's not like my TV is running out of storage.

For anyone who doesn't necessarily care about data collection, accepts that it'll happen to some degree and can't be escaped, and wants an easy experience without ads: just remember to never connect your TV to the internet, and get an Apple TV 4K for $150. The software isn't janky like some Android options, it isn't sluggish, and you only need the one remote unless you're changing inputs. I literally never see or interact with the menu of my smart TV, and it's great.

Why would you rather have Apple have your data than LG/Sony/Samsung?

Considering that I have an iPhone, yes.

Wouldn't it be better for your privacy if your viewing data goes to another company?

I use the same streaming apps on my phone as well, so they already know what Iā€™m watching.

After hearing about one of the TV manufacturers that would take screen caps and read media file names when playing files from external storage, I'll never connect a smart TV to the internet again.

I'm about to get rid of my ageing "dumb" TV and not replace it. Everything comes in to my laptop now, so any monitor and set of speakers to plug it in to will do.

My prediction is that this is going to be the end of the line for TVs as stand-alone hardware - just like most people don't really have stand-alone Hi-Fi systems any more.

I do it the other way: my laptop and PC are plugged via HDMI into giant TVs and wired to the sound system.

I don't know how anyone could put up with ads on their TV UI. I actively avoid all platforms that force ads: I quit playing Xbox years ago when they started putting ads on the dashboard, and I quit watching TV because of ads about 12 years ago. They need us more than we need them.

Well, considering that even the Shield got ads after some updates, I don't think there's escape.

@Templa

A complete shame. I installed a custom launcher (Wolf Launcher) and it got rid of all ads. Clean screen again.

Apple TV.

Is Apple TV able to run emulators and Steam Link?

Steam Link: yes; there's an app for that.

Emulators: the tvOS App Store doesn't allow them, but you can weasel around that if you have a Mac with the developer tools installed, which lets you build and deploy apps directly to the Apple TV. Emulators for iOS will also work on tvOS, unless they depend on WebKit to run.

I have an old TV on its last legs. Insignia makes dumb TVs. Does anyone have experience with those? I haven't had luck with Insignia in the past.

You can't find a decent quality standard TV? Why?

Have you tried in the past few years? They basically don't exist anymore in the consumer space.

You can find fairly large computer monitors, and some monitors have built in speakers. That's about the best you can do.

Yes, I have, and I can go to my local supermarket and get a conventional LCD with no "smart" functions.

Most probably I won't be buying a brand name, but I can buy a conventional LCD TV.

I'm using a Sony TV that's not connected to the Internet, and I stream stuff using a PS4. The TV doesn't complain. Does anyone know if PlayStation analyzes the data the same way Samsung does, for example? So far there are no ads on PlayStation.

Not anymore. The PS3 would tell everyone on the PSN what you were doing at the time, including running streaming apps. They put a stop to that, and improved the PSN ID privacy settings, on the PS4 and PS5.

Smart TVs in general suck, but I'm getting a lot of mileage out of my TCL Google TV, firmly in spite of what is intended, by sideloading apps.

Privacy and security-wise it's probably terrible but I'm just tired of having to think about it.

So my wife uses some niche TV apps like Weverse, which is available on Samsung TVs. Which of the media players (Chromecast, Apple TV, Roku, Fire Stick, etc.) has the most apps covered?

I have a Chromecast (with Google TV; their branding is a mess) and an Apple TV, and I've used the Rokus that family members have. In my experience the Chromecast has the best app selection; it's just running a modified version of Android. It does require TV-specific versions of apps, but the standards are as low as the rest of the Google Play Store, for better and worse, so anything you can get on an Android phone that you'd want on a TV is probably on there. It also (with a bit of trouble) allows sideloading, and I've put some GitHub projects on there like SmartTubeNext (ad-free YouTube with a better UI).
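
If you want to try the sideloading route yourself, this is roughly the adb flow I mean; it's only a sketch, assuming you've enabled Developer options and ADB/network debugging on the device (on a Chromecast with Google TV that's done by tapping the build entry under Settings > System > About several times), and the IP address and APK filename below are placeholders:

```
# From a PC on the same network as the Chromecast with Google TV:
adb connect 192.168.1.42:5555    # placeholder IP -- use the address shown in the device's network settings
adb install SmartTube_beta.apk   # placeholder filename -- use the APK downloaded from the project's releases
adb disconnect
```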

An Android-based device does make sense. Thanks! I'll look into getting a Chromecast.