Although it says "4K TV", as far as I know, there's (virtually) no actual TV broadcast in 4K, so what I'm really talking about here is the state of home theater technology as of around 2018. This post is about the hardware and the formats. Part 2 will be about 4K content and whether the 4K thing is worth the money.
First, let's have a look at HD TV and Blu-ray, and then see what's been added on top of that. HD is either a resolution of 1280x720 at 24 to 60 frames per second (720p) or 1920x1080 at 24 to 30 FPS (1080i or 1080p). The contrast ratio (dynamic range) and the range of colors that HD supports are basically the same as those of standard definition (SD) digital video, and of analog video before it.
4K increases the resolution to 3840x2160, or 2160p. Framerates run from 24 (actually often 23.98 for historical reasons) to 60. But along with the (almost) 4K resolution, there are two other benefits: high dynamic range (HDR) and wide color. HDR means that the TV or projector can more accurately show very dark and very bright details. Usually that means that bright highlights can be much brighter, which looks more natural, as long as it's not overdone. Wide color means that the TV or projector can show more saturated colors. For instance, a scene in red emergency light is really red. Of course normal colors still look normal.
To watch 4K content with HDR and wide color, you need a 4K TV or a 4K projector. (Whenever I write "TV", read that as "TV or projector".) The 4K part is easy: either the TV supports 4K or it doesn't. As far as I know, all 4K TVs support wide color to a reasonable degree, but there are definite differences in this area. The biggest difference is with the high dynamic range, HDR. I have an OLED TV, which has pretty good HDR. LCD TVs have typically struggled with their contrast ratios, which means that if there's something really bright on screen, you can't also have inky blacks at the same time. But modern LCD TVs have improved in this area. Unless I'm mistaken, reasonably priced computer monitors with 4K or higher resolution won't display HDR. They may or may not support wide color, such as the P3 color space used by Apple's latest laptops and iMacs.
Note that at regular viewing distances, you're not going to see the difference between HD and 4K on a TV. Even with a pretty big TV, you need to sit relatively close.
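A rough sketch of why: the usual rule of thumb is that 20/20 vision resolves about one arcminute. Here's a quick calculation for a 65-inch 16:9 TV (the screen size is my example, not anything from a spec):

```python
import math

def max_useful_distance_m(diagonal_in, horizontal_pixels, aspect=(16, 9)):
    """Distance beyond which one pixel subtends less than one
    arcminute (roughly the limit of 20/20 vision)."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)
    pixel_pitch_in = width_in / horizontal_pixels
    one_arcmin = math.radians(1 / 60)          # ~0.00029 rad
    distance_in = pixel_pitch_in / math.tan(one_arcmin)
    return distance_in * 0.0254                # inches -> meters

# On a 65-inch TV, 4K pixels blur together beyond roughly 1.3 m,
# and 1080p pixels beyond roughly 2.6 m. Sit farther away than that
# and 4K and HD look the same.
print(round(max_useful_distance_m(65, 3840), 2))  # ~1.29
print(round(max_useful_distance_m(65, 1920), 2))  # ~2.58
```

So on a typical couch three or more meters from the screen, the extra pixels are wasted; it's the HDR and wide color that you'll actually notice.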
Virtually all TVs these days are smart TVs, which means that they can show services such as Netflix and YouTube, and perhaps Amazon Prime Video, without any additional hardware. I have an Apple TV 4K that will show Apple's content as well as stuff in the iTunes library on my computer. However, Apple has made deals with TV makers, so it looks like with newer TVs you won't need an Apple TV box to show movies you buy from Apple or Apple's upcoming TV service.
Although streaming 4K content has many advantages, those shiny little discs can't be beat. I absolutely recommend getting a UHD (ultra high definition) Blu-ray player such as the Sony UBP-X700.
You can hook up the 4K streaming box and/or UHD blu-ray player directly to the 4K TV. Be sure to get HDMI cables that support 18 Gbps. Those will work without trouble with 4K HDR video at 60 FPS. Regular cables are rated for 10.2 Gbps and won't work reliably at the highest resolutions and framerates. The 18 Gbps cables should be certified as "Premium High Speed HDMI Cable", but in my experience, it's very hard to be sure what you're getting.
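That 18 Gbps figure isn't arbitrary: it's the total TMDS bandwidth of HDMI 2.0, and you can work it out from the standard 4K60 video timing (which includes blanking intervals around the 3840x2160 active picture):

```python
# The standard 4K60 timing is 4400 x 2250 total pixels per frame
# (3840 x 2160 active picture plus horizontal/vertical blanking).
pixel_clock = 4400 * 2250 * 60          # 594,000,000 pixels per second

# HDMI sends three TMDS channels, each transmitting 10 bits
# on the wire per 8-bit value (8b/10b encoding).
tmds_bandwidth = pixel_clock * 3 * 10   # bits per second

print(tmds_bandwidth / 1e9)             # 17.82 -> marketed as "18 Gbps"

# Older "High Speed" cables are only rated for a 340 MHz pixel clock:
print(int(340e6) * 3 * 10 / 1e9)        # 10.2
```

Which is why a cable rated for 10.2 Gbps can still handle 4K at 24 or 30 FPS, but starts dropping out at 4K60 with HDR.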
An issue that I've been having is that sometimes after changing resolutions, I find myself looking at a screen full of static. I assume that has something to do with the HDMI cables, but it's hard to be sure, and I don't want to pull another HDMI cable through the wall, so I just live with it and change inputs on the TV a few times to clear up the issue.
Of course if you use the TV to play content directly, or hook up a streaming box or UHD blu-ray player to the TV, the audio will come from the TV speakers. That's fine, but who wants to settle for fine in these days of eight-or-more channel lossless surround sound? So that means hooking up a surround receiver.
I got an Onkyo TX-NR686, which supports 7.1 surround or 5.1.2 Dolby Atmos and DTS:X. "5.1" means speakers left and right of the screen (the front speakers), a center speaker that's mostly used for dialog, and left and right surround speakers positioned to the left and right of (and often a bit behind) the viewer. The .1 is a subwoofer or "low frequency effects channel" that handles all the very low frequencies (explosions!) for which we can't tell from which direction they come anyway. 7.1 is 5.1 with two extra speakers behind the viewer.
Dolby Atmos and DTS:X add two things to earlier surround sound: object-based sound and height speakers. The .2 in 5.1.2 is the number of height speakers. The height speakers are often used in action scenes for stuff that comes flying overhead. Object-based sound means that rather than sending audio to one or more of the speakers, sounds can be placed at any location around the viewer, and then the surround decoder figures out which speakers to send that sound to, based on the available speakers in the current setup.
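To give a feel for what the renderer does, here's a deliberately simplified sketch (this is the general idea of amplitude panning, not Dolby's actual algorithm, and real Atmos renderers work in three dimensions): an object has a position, and the decoder splits its signal between the nearest speakers with gains chosen to keep the total power constant.

```python
import math

def pan_object(angle_deg, speakers):
    """Constant-power panning of one sound object between the two
    speakers that bracket its angle. `speakers` maps speaker name to
    its angle in degrees (0 = straight ahead, negative = left)."""
    ordered = sorted(speakers.items(), key=lambda kv: kv[1])
    for (name_a, a), (name_b, b) in zip(ordered, ordered[1:]):
        if a <= angle_deg <= b:
            frac = (angle_deg - a) / (b - a)   # 0 at speaker a, 1 at b
            # sin/cos law keeps gain_a^2 + gain_b^2 == 1 (constant power)
            return {name_a: math.cos(frac * math.pi / 2),
                    name_b: math.sin(frac * math.pi / 2)}
    raise ValueError("angle outside speaker span")

# An object halfway between the front-left and center speakers
# gets equal gain (about 0.707) on both:
gains = pan_object(-15, {"front_left": -30, "center": 0, "front_right": 30})
```

The nice property is that the mix isn't baked to a fixed speaker layout: the same object positions can be rendered to 5.1.2, 7.1.4, or anything else the decoder knows about.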
Although the effect is fairly subtle, Dolby Atmos (I don't have any content with DTS:X yet) places sounds more realistically than regular surround sound does.
Originally, I had my height speakers close to the ceiling on the wall behind me. That didn't work too well for two reasons. First, the speakers pointed forward rather than downward, so not much of their output reached me. Second, much more sound is placed up high in front than in the back. Once I moved those speakers to the front height positions, I noticed much more height action.
Of course it would be even better to have a 7.1.4 setup, with speakers behind me and front and back height speakers. But I don't really have room for the extra speakers and the surround receivers get more expensive quickly as you add more channels.
Unless I'm mistaken, all 4K TVs and players support HDR and wide color using the HDR10 format. So you get that more or less automatically. On top of that, many devices support Dolby Vision. Dolby Vision adds extra information that allows for even better dynamic range and color. So you'll want to enable Dolby Vision on your players if your TV supports it. Currently, you're fine without Dolby Vision, but I suspect in a few years content creators will assume everyone has Dolby Vision and no longer spend the time to make things look good without it.
An advantage of Dolby Vision is that the player can instruct the TV exactly how to handle color and dynamic range, so even content that's not in Dolby Vision tends to look more accurate. So what I like to do is set the Apple TV to 4K Dolby Vision at 60 FPS, and then adjust the frame rate to that of the content, but not adjust the dynamic range. So the Apple TV converts everything to Dolby Vision. There doesn't seem to be any downside to that.
When playing movies and TV shows that are filmed at 24 frames per second, the Apple TV switches to 24 FPS. (Which takes a few seconds.) That's not really necessary, as the TV will display 24 FPS content that's transmitted at 60 FPS without problems. But European content is in 25 FPS, and 25 FPS converted to 60 FPS looks pretty bad: smooth movement now has a hiccup every second or so.
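You can see why 25 FPS suffers more than 24 FPS by counting how many 60 Hz refreshes each source frame gets when it's naively repeated to fill the display rate:

```python
def repeat_pattern(source_fps, display_hz=60):
    """For each source frame in one second, how many display refreshes
    it is shown for when repeating frames to match the display rate."""
    return [int((i + 1) * display_hz / source_fps)
            - int(i * display_hz / source_fps)
            for i in range(source_fps)]

print(repeat_pattern(24))  # [2, 3, 2, 3, ...]: the regular 3:2 pulldown
print(repeat_pattern(25))  # mostly 2s with occasional 3s, irregularly spaced
```

24 FPS gets a perfectly regular alternation, which the eye mostly accepts. 25 FPS gets ten frames per second held one refresh longer than the rest, in an uneven rhythm, and that's the stutter you see.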
Note that Apple TV apps need to opt in to frame rate adjustment. Apple's built-in apps do this, as does the Netflix app, but YouTube and Amazon Prime Video don't.
If you like to create, rip or download your own content and play it from a computer or USB drive, it's important to understand the different codecs and file formats.
4K content is usually encoded with the HEVC / H.265 codec. The color profile is then usually BT.2020, but sometimes this can be messed up and you need to use a tool to set it correctly, or the video will look extremely flat. The most common file formats are MP4 and MKV. Most devices can read both, but Apple only does MP4.
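If the color metadata got messed up, one way to fix it without re-encoding is ffmpeg's `hevc_metadata` bitstream filter, which rewrites the tags in the HEVC stream. The snippet below only builds the command (the file names are placeholders, and the numeric values are the standard H.273 code points: 9 for BT.2020 primaries/matrix, 16 for the SMPTE 2084 "PQ" HDR transfer curve):

```python
def bt2020_fix_cmd(src, dst):
    """Build an ffmpeg command that re-tags an HEVC stream as
    BT.2020 / SMPTE 2084 without re-encoding the video."""
    bsf = ("hevc_metadata=colour_primaries=9:"
           "transfer_characteristics=16:matrix_coefficients=9")
    return ["ffmpeg", "-i", src, "-c", "copy", "-bsf:v", bsf, dst]

# To actually run it:
#   import subprocess
#   subprocess.run(bt2020_fix_cmd("movie.mkv", "fixed.mkv"), check=True)
```

Since the video is stream-copied, this takes seconds rather than hours, and the flat, washed-out look disappears once the player knows which transfer curve to apply.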
Things get more complex for audio:
AAC. This is a good option when encoding audio yourself. As far as I know, players always decode AAC themselves and send it as PCM (uncompressed audio) to a receiver or TV.
Dolby Digital (AC3). This is a pretty old codec by now, so it's not as efficient as it could be. However, everything supports it.
Dolby Digital Plus (E-AC3). An improved version of Dolby Digital. If you want to use this, check if all your devices support it. E-AC3 can support Dolby Atmos, and E-AC3 with Dolby Atmos can be put in an MP4 file, which can be streamed to the Apple TV.
Dolby TrueHD. This is a common format on Blu-ray and UHD Blu-ray. It's lossless, and gets pretty big. Dolby TrueHD can carry Dolby Atmos. It can only be put in MKV files, not MP4. I haven't been able to play MKV files with Dolby TrueHD from a USB drive on my setup. So if I rip a UHD Blu-ray title, I lose the Dolby Atmos. (And the Dolby Vision, but I still have the HDR.)
DTS-HD Master Audio. Also a common format on Blu-ray and UHD Blu-ray. Also lossless and thus big. Has support for DTS:X object-based audio. I don't know about compatibility.
A TV can send 2-channel PCM and AC3 / E-AC3 (optionally with Dolby Atmos) to a receiver using the HDMI audio return channel (ARC) or a TOSLINK cable. This is what happens when watching TV, using a streaming app directly on the TV, or when you've hooked up a player device to the TV rather than the receiver. Dolby TrueHD and DTS-HD Master Audio can't be sent from a TV to a receiver (with HDMI 2.0 or older) because they use too much bandwidth.