<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>media optimization &#8211; Gig City Geek</title>
	<atom:link href="https://gigcitygeek.com/tag/media-optimization/feed/" rel="self" type="application/rss+xml" />
	<link>https://gigcitygeek.com</link>
	<description></description>
	<lastBuildDate>Fri, 01 May 2026 14:28:54 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://gigcitygeek.com/wp-content/uploads/2026/01/cropped-GigCityGeek_Logo-32x32.png</url>
	<title>media optimization &#8211; Gig City Geek</title>
	<link>https://gigcitygeek.com</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Shrinking Your Media Library: The Robot Solution</title>
		<link>https://gigcitygeek.com/2026/05/01/automated-media-optimization-plex-docker/</link>
					<comments>https://gigcitygeek.com/2026/05/01/automated-media-optimization-plex-docker/#respond</comments>
		
		<dc:creator><![CDATA[Laronski]]></dc:creator>
		<pubDate>Fri, 01 May 2026 13:00:00 +0000</pubDate>
				<category><![CDATA[Software]]></category>
		<category><![CDATA[automation]]></category>
		<category><![CDATA[backups]]></category>
		<category><![CDATA[docker]]></category>
		<category><![CDATA[ffmpeg]]></category>
		<category><![CDATA[file size]]></category>
		<category><![CDATA[library]]></category>
		<category><![CDATA[media optimization]]></category>
		<category><![CDATA[plex]]></category>
		<category><![CDATA[x264]]></category>
		<category><![CDATA[x265]]></category>
		<guid isPermaLink="false">https://gigcitygeek.com/?p=3738</guid>

					<description><![CDATA[Struggling with a bloated media library? Learn how automation can quietly reduce file sizes without sacrificing quality. A workflow scanning and optimizing f...]]></description>
										<content:encoded><![CDATA[<div>
<div>
<div>
<div>
<p>Just last month I was sitting around, <a href="https://www.plex.tv/" target="_blank" rel="noopener noreferrer">Plex</a> on one monitor and <a href="https://www.docker.com/" target="_blank" rel="noopener noreferrer">Docker</a> stats on the other, wondering how a “big enough” array suddenly felt cramped. Every show looked fine, but backups dragged, the disks were noisy, and my wife had already shut down the “I’ll just buy another drive” idea. Underneath all the dashboards, the problem was boring: I was stockpiling bloated <a href="https://en.wikipedia.org/wiki/H.264" target="_blank" rel="noopener noreferrer">x264</a> files.</p>
<p>I was not interested in hand writing <a href="https://ffmpeg.org/" target="_blank" rel="noopener noreferrer">ffmpeg</a> commands at midnight or rebuilding my whole stack.</p>
<p>I just wanted something that would quietly make the files smaller without anyone in the house noticing a quality drop. There it was, in a post on Reddit.</p>
<p>For us, that has been a clear net positive.</p>
<h4>Why I let a robot touch my files</h4>
<p>In my house, my son’s 4K anime habit and my wife’s favorite comfort shows are sacred. If a tool replaces those files, it cannot break playback, wreck subtitles, or trash the one iconic scene in each episode. The bar for <a href="https://en.wikipedia.org/wiki/Automation" target="_blank" rel="noopener noreferrer">automation</a> is very high.</p>
<p>What won me over was a workflow built around scanning first and touching files second. It walks the library with <a href="https://ffmpeg.org/ffprobe.html" target="_blank" rel="noopener noreferrer">ffprobe</a>, looks at codec, resolution, and bitrate, and only then decides if a file is even worth a shot at <a href="https://en.wikipedia.org/wiki/H.265" target="_blank" rel="noopener noreferrer">x265</a>. When it does encode, it can score the result with <a href="https://netflix.github.io/vmaf/" target="_blank" rel="noopener noreferrer">VMAF</a> and simply discard any output that does not meet a minimum quality threshold, leaving the original in place.</p>
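<p>As a rough sketch of that scan-first logic, here is what the decision might look like in Python. The codec names, the per-resolution bitrate floors, and the VMAF minimum of 93 are illustrative assumptions of mine, not the tool&#8217;s actual defaults.</p>

```python
# Scan-first sketch: assumes ffprobe-style metadata (codec, height,
# bitrate) has already been collected for each file. All thresholds
# below are illustrative assumptions, not the tool's real defaults.

def is_reencode_candidate(codec: str, height: int, bitrate_bps: int) -> bool:
    """Decide whether a file is even worth a shot at x265."""
    if codec in ("hevc", "av1"):  # already efficient: leave it alone
        return False
    # Rough per-resolution bitrate floors: below these, a re-encode
    # is unlikely to save meaningful space.
    floors = {2160: 8_000_000, 1080: 2_500_000, 720: 1_500_000}
    floor = next((f for h, f in sorted(floors.items()) if height <= h),
                 8_000_000)
    return bitrate_bps > floor

def passes_quality_gate(vmaf_score: float, minimum: float = 93.0) -> bool:
    """Discard any encode scoring below the VMAF minimum."""
    return vmaf_score >= minimum
```

<p>The point of the two separate gates is that a file can fail cheaply at the probe stage, before any CPU time is spent, and an encode can still be thrown away afterward if VMAF says quality dropped.</p>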
<p>Having that kind of safety net makes “let it run in the background” feel sane instead of reckless.</p>
<h4>Reencode or redownload: the kitchen table math</h4>
<p>The obvious question in my house was cost. If you have thousands of files, is it wasteful to chew power and GPU time reencoding them all, instead of just redownloading <a href="https://en.wikipedia.org/wiki/H.265" target="_blank" rel="noopener noreferrer">HEVC</a> releases and calling it a day?</p>
<p>For new content, I lean hard toward native H.265. <a href="https://sonarr.tv/" target="_blank" rel="noopener noreferrer">Sonarr</a> and Radarr are tuned to prefer HEVC so fresh downloads usually land in the right format. My wife never hears the word codec; she only notices that things start quickly and stream smoothly.</p>
<p>For the giant pile already sitting under <code>/media</code>, the math reverses.</p>
<p>I already paid the bandwidth and indexer cost to get those files. Redownloading terabytes would hit caps, risk worse encodes, and still leave me juggling replacements in Plex and Jellyfin. Letting a tool reencode locally, with VMAF and “no savings” detection as a gate, turns it into a one time CPU or GPU bill that often cuts file sizes by half or more. In my house, that beats buying yet another drive and stretching backups even further.</p>
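<p>The kitchen table math itself is simple enough to write down. The 50% reduction and the 10% minimum-savings gate below are my own illustrative numbers based on the rough &#8220;half or more&#8221; figure, not guaranteed results.</p>

```python
# Back-of-the-envelope math for reencode-vs-redownload. The 0.5
# reduction and 10% minimum-savings gate are illustrative assumptions.

def worth_keeping(original_bytes: int, encoded_bytes: int,
                  min_savings: float = 0.10) -> bool:
    """'No savings' detection: reject encodes that barely shrink."""
    saved = original_bytes - encoded_bytes
    return saved / original_bytes >= min_savings

def library_savings_tb(library_tb: float, reduction: float = 0.5) -> float:
    """Space freed if the backlog cuts file sizes by `reduction`."""
    return library_tb * reduction
```

<p>Run against a 20&#160;TB backlog at a 50% reduction, that frees roughly 10&#160;TB, which is the comparison that beat buying another drive in my house.</p>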
<p>So I download HEVC going forward, and I reencode the backlog I already trust.</p>
<h4>Quiet gains in the background</h4>
<p>The part I appreciate day to day is how library aware the process feels. It pulls in TMDB metadata, understands native language versus dubs, and can strip commentary tracks while keeping the few languages my wife and I actually need. A lot of the time it just remuxes audio and subtitles without touching the video, so the job is fast and lossless.</p>
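<p>The remux-only path can be sketched as an ffmpeg command builder. The language list and stream selectors here are assumptions for the example; the real tool makes these decisions from its own metadata.</p>

```python
# Minimal sketch of the "remux only" path: copy the video stream
# untouched and keep only the audio languages the family needs.
# The default language list is an assumption for this example.

def remux_command(src: str, dst: str, keep_langs=("eng", "jpn")) -> list[str]:
    """Build an ffmpeg invocation that rewrites containers, not video."""
    cmd = ["ffmpeg", "-i", src, "-map", "0:v:0", "-c:v", "copy"]
    for lang in keep_langs:
        # Select every audio stream tagged with this language;
        # the trailing '?' makes the mapping optional if none exist.
        cmd += ["-map", f"0:a:m:language:{lang}?"]
    cmd += ["-map", "0:s?", "-c:a", "copy", "-c:s", "copy", dst]
    return cmd
```

<p>Because every codec is set to <code>copy</code>, the job is limited by disk speed rather than encoding speed, which is why these passes finish in minutes instead of hours.</p>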
<p>On my server, I keep a couple of CPU jobs running during the day, then let the GPU open up overnight when nobody is watching.</p>
<p>From the family’s perspective, nothing has changed except that Plex feels snappier and I complain less about disk space.</p>
<h4>Where this can still bite you</h4>
<p>There are real tradeoffs. You need to be comfortable with Docker, <a href="https://docs.docker.com/engine/reference/run/#path-mapping" target="_blank" rel="noopener noreferrer">path mappings</a>, and the idea that one app has permission to rewrite your media. If your library lives on flaky <a href="https://en.wikipedia.org/wiki/Network_File_System" target="_blank" rel="noopener noreferrer">NFS</a> or <a href="https://en.wikipedia.org/wiki/Server_Message_Block" target="_blank" rel="noopener noreferrer">SMB</a> shares, you will occasionally be reading logs instead of relaxing on the couch.</p>
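<p>To make the path-mapping point concrete, here is a hypothetical docker-compose fragment. The service name, image, and host paths are placeholders; the key idea is that the container&#8217;s <code>/media</code> must map to the same underlying storage Plex mounts, or the tool will rewrite files at paths the media server cannot find.</p>

```yaml
# Hypothetical compose fragment -- image name and host paths are
# placeholders, not a real project's configuration.
services:
  optimizer:
    image: example/media-optimizer:latest   # placeholder image
    volumes:
      - /mnt/storage/media:/media           # same host path Plex mounts
      - /mnt/storage/backups:/backups       # holding area for originals
```
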
<p>The saving grace is paranoia. Originals can be kept in a backup folder for days, every output is verified before the source is touched, and files that do not shrink enough are simply skipped and marked as ignored.</p>
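<p>That verify-then-replace step is worth sketching, because it is the part that makes the whole thing safe. The function and folder names below are illustrative, not the tool&#8217;s actual layout; the shape of the logic is what matters.</p>

```python
# Sketch of the cautious replace step: the encode only displaces the
# original after a savings check, and the original is parked in a
# backup folder rather than deleted. Names here are illustrative.
import os
import shutil

def safe_replace(original: str, encoded: str, backup_dir: str,
                 min_savings: float = 0.10) -> bool:
    """Promote `encoded` into `original`'s place, keeping a backup.

    Returns False (leaving the original untouched) when the encode
    did not shrink the file enough to be worth keeping.
    """
    orig_size = os.path.getsize(original)
    enc_size = os.path.getsize(encoded)
    if (orig_size - enc_size) / orig_size < min_savings:
        os.remove(encoded)  # discard the losing encode, keep original
        return False
    os.makedirs(backup_dir, exist_ok=True)
    shutil.move(original,
                os.path.join(backup_dir, os.path.basename(original)))
    shutil.move(encoded, original)  # encode takes the original's path
    return True
```

<p>Nothing is ever deleted in the hot path: a bad encode removes itself, and a good one still leaves the original recoverable from the backup folder for as long as you choose to keep it.</p>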
<p>That mix of caution and automation is what finally let me shrink the library under my desk without my wife or my son ever realizing anything changed.</p>
</div>
</div>
</div>
</div>
]]></content:encoded>
					
					<wfw:commentRss>https://gigcitygeek.com/2026/05/01/automated-media-optimization-plex-docker/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
