Through my journey of working on computers, I've taken photos of the things I've worked on. In late 2021, I started learning to use Photoshop, and my photography got a lot better from that point onwards. Here's a collection of my exploits over a journey of learning, exploration, and service at The IT Club. - Ethan Z, Founder of The IT Club
This absolutely glorious piece of machined copper is a replacement IHS, or integrated heat spreader, for certain Intel LGA115X CPUs.
CNC machining is a time vs. quality balance--if you're putting a machined IHS on each of the tens of millions of desktop CPUs you spit out each month, they'll be of acceptable quality and cheap to make, but not nearly as good as an aftermarket IHS. These are typically only sold in very small quantities to the most extreme enthusiasts, so the manufacturer can spend a lot of time and effort making it the best they can.
Don't worry about the machining marks--despite the appearance, both sides are almost perfectly flat.
A partial collection of both my and The IT Club's unquestionably retro CPUs!
Sometimes, the computers donated to The IT Club are ancient. Far too old to even be refurbished; even their theoretical peak performance under ideal circumstances falls orders of magnitude short of what Windows 7 needs. However, that doesn't mean that donating them is for naught! We preserve historically significant systems and components like these, preventing them from being scrapped for precious metals. This also prevents environmental contamination, since older hardware like what's shown here contains far more hazardous chemicals than modern hardware.
Among the CPUs pictured here are Coppermine Pentium IIIs (blue-and-green squares), Pentium Pros (the gold things), pre- and post-MMX Pentiums, K6-2 and K6-IIIs, Pentium IIs, and the i486SX, the oldest CPU in this group.
Pre- and post-MMX
Intel introduced MMX, a new CPU instruction set, to consumer Pentium CPUs with a massive publicity storm, touting the magic of MMX. In reality, according to the absolute museum of a site Redhill, "the MMX extensions made no performance difference in real life, but the sum of all the other changes to the MMX made the 166 about 10% faster than a Pentium Classic [non-MMX] 166, 5% faster than a Pentium Classic 200, or 1% faster than a 6x86 200 Classic. No matter which of these you compared it to, it was a great performer". (Coincidentally, the CPU on the right is the exact model of 166 MHz Pentium MMX described by Redhill.)
Of note, the Pentium MMX on the right also features a metal heatspreader to transfer heat more efficiently to a heatsink and fan, thus making the CPU easier to cool.
Before Apple used Intel CPUs (and now their own ARM-based silicon), they used IBM PowerPC processors. While the early generations provided a fair performance advantage over Intel's and AMD's competing x86-based offerings, later generations, such as the G5 PowerPC 970FX pictured here, were quite hot and power-hungry. This kept Apple from making laptops with long battery life, which prompted the switch to Intel.
This particular CPU is mounted on its own little card, and pulled from a G5-based Power Mac. Unfortunately, coolant from Apple's poorly-designed watercooling solution spilled onto its sister card (not pictured), resulting in devastating corrosion.
Ahh, OCZ. A fallen legend of the computer-parts industry. OCZ was known for being big, bold, and utterly ridiculous with the things they sold. Catering to seekers of the best-of-the-best, they sold things like some of the first consumer PCIe SSDs, binned (specially selected for performance traits based on manufacturing variation) CPUs and GPUs, power supplies, and this here kit of DDR2 running at DDR3 speeds and exceptionally tight timings, a rare find. Most importantly, OCZ pioneered consumer solid-state drives, a significant reason why most consumer desktops and laptops use SSDs today. While OCZ itself is gone, the brand lives on under Toshiba, who bought them after a host of legal issues.
Quick-disconnects and other fittings
These small pieces of brass and steel are crucial components for my upcoming liquid-cooled computer. They help connect tubing to other components like waterblocks (which remove heat from electronic components), radiators (dissipate heat into the surrounding air), and pumps. I've chosen right-angle fittings because, at the angles at which tubing connects to other components, right-angles minimize bends and therefore tube flattening and kinking. This is crucial to maintaining coolant flow and preventing overheating, as my desktop is capable of dumping nearly 1,000 watts into the cooling system, continuously.
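As a back-of-the-envelope check on that near-1,000-watt figure, here's how much the coolant itself heats up per pass through the loop. The 1 L/min flow rate is my own assumption for illustration, not a measurement:

```python
# Rough sanity check of coolant temperature rise across the loop.
# Assumptions (mine, not measured): ~1 L/min flow, plain water,
# and the full ~1,000 W heat load mentioned above.

def coolant_delta_t(watts: float, flow_l_per_min: float) -> float:
    """Steady-state water temperature rise in degrees C: dT = P / (m_dot * c)."""
    c_water = 4186.0               # J/(kg*K), specific heat of water
    m_dot = flow_l_per_min / 60.0  # kg/s (1 L of water is about 1 kg)
    return watts / (m_dot * c_water)

print(round(coolant_delta_t(1000, 1.0), 1))  # water gains ~14 degrees C per pass at 1 L/min
```

Doubling the flow rate halves the per-pass rise, which is part of why pump choice matters for a loop carrying this much heat.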
The fan with a Napoleon complex
I genuinely fear this fan.
At 40mm in size, it's fairly small, good for use in thin, powerful 1U servers. However, it draws 0.75A (compared to 0.06 for a similarly-sized 40mm Noctua fan), spins at 23,000 RPM, and eats fingers for breakfast. It is capable of lifting its own weight. It sounds like a jet taking off at full RPM. It scares me.
This glorious beast was Intel's 32-bit flagship back in 1995. To put this in context, 16-bit operating systems and programs were still the norm, so beyond its monstrous size, its odd architecture also set it apart from every other CPU on the market. While it was slower in 16-bit workloads than a standard Pentium (non-Pro), one of which is pictured elsewhere on this page, its 32-bit performance and insane amount of cache (up to 1MB! For a CPU released in 1995!) made it an excellent CPU for workstations and servers. While it's far too slow for any modern task, it looks absolutely glorious, and I wouldn't be the least bit surprised if it still functions.
This is the waterblock going into my new computer, made by Optimus Cooling. It wicks heat away from the CPU it sits on, transferring it through its tiny CNC-machined fins into the coolant flowing through it, ensuring that the CPU will run nice and cool.
Having a cool, quiet system is important for 24/7 operation, which I plan to be using my new desktop for. Besides being used as a workstation, it's also going to be running Folding@Home 24/7 for our sister nonprofit, Folding@SiliconValley, to contribute to groundbreaking medical research.
This is Erasable-Programmable Read-Only Memory. Data is electrically written to it once, and then the contents will remain like that indefinitely. However, if one wants to erase an EPROM, then simply exposing the silicon die under the quartz window to UV light for a few minutes will erase its contents. That's why EPROMs are typically covered by some sort of opaque tape when in operation to prevent accidental erasure. (The piece of aluminum tape on this particular EPROM was replaced after photos were taken.)
Early AMD chip
Back before AMD made the CPUs, GPUs, and FPGAs that we know and love today, they made a bunch of other silicon! For instance, this PROM chip. It's nice to see how their logo hasn't changed ever since they were founded.
AMD + Intel
Before Intel and AMD got to work trying to disembowel each other, they actually collaborated a fair bit (like how Intel licensed Socket 7 to AMD). This chip is one such example.
Hard drive controller
This ancient thing was found on the Fujitsu hard drive mentioned later on this page! It gets hot enough that I plan to put a heatsink on it soon. But then again, the whole hard drive consumes a whopping 40 watts at idle.
Huge retro hard drive
This is the Fujitsu M2249SA, which is 33 years old as of 2022. This thing stores up to 200MB, spins at 3600 RPM, and features 16MB of cache. It is insanely large and heavy, being a 5.25" full-height drive (three times thicker than even the Quantum Bigfoot mentioned later). It also draws so much power that some parts of the controller board need to be cooled, using the metal drive chassis as a heatsink.
The back is populated with quite a few chips.
The M2249SA dwarfs every other drive I have.
The main competitor to the hot-running Pentium 4, the Athlon XP line is quite a spectacle. Instead of using an Integrated Heat Spreader (IHS) like the Pentium 4, AMD expected you to mount the cooler on the bare die (hence the four foam supports on the corners). You had to be careful to avoid cracking or chipping the die. Despite the mechanical flaws, Athlon XP was a formidable chip in its time.
AMD shipped quite a few of these CPUs as locked parts, meaning you couldn't adjust clockspeeds or voltages, and hence no overclocking. However, people quickly discovered that shorting out all the connection points on the package labelled "L1" with a pencil would circumvent this restriction.
AMD's Duron was the budget-oriented version of the Athlon. It had a smaller L2 cache, but was otherwise quite a compelling value buy.
This red input device has been a distinguishing feature of Thinkpads for decades. While I personally prefer using the trackpad, the Trackpoint is especially useful for people wearing gloves, which renders a trackpad or capacitive touchscreen ineffective. Trackpoints, generically known as a pointing stick (or a C-stick on certain 3DS systems), use small strain sensors to measure the change in resistance of a material as it's squashed and stretched during use to discern mouse inputs.
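For the curious, here's a hypothetical sketch of the idea described above: two orthogonal strain readings map, after a dead zone, to cursor velocity. This is not IBM/Lenovo's actual firmware; every name and constant here is my own invention for illustration:

```python
# Hypothetical sketch of how a pointing stick might turn strain readings
# into cursor deltas. The dead zone ignores sensor noise near rest;
# pressing harder moves the cursor faster.

def strain_to_delta(rx: float, ry: float, dead_zone: float = 0.05,
                    gain: float = 40.0) -> tuple[int, int]:
    """Map normalized strain (-1..1 per axis) to per-tick cursor movement."""
    def axis(v: float) -> int:
        if abs(v) < dead_zone:  # below the dead zone: treat as no input
            return 0
        return round(v * gain)  # otherwise scale strain into pixels/tick
    return axis(rx), axis(ry)

print(strain_to_delta(0.02, 0.5))  # light noise on X, firm push on Y -> (0, 20)
```

Real implementations also recalibrate the rest position over time (which is why a Trackpoint cursor sometimes drifts briefly after a long press), but the core idea is this simple mapping.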
The IT Club services quite a few Thinkpads. These are usually Ivy Bridge and older, so they're the generations from when they used to be noticeably better than many other laptops. Cost-cutting and emphasizing form over function in recent generations means that Thinkpads are still decent laptops, but there are usually better options out there.
This particular Trackpoint is from my pet-project Thinkpad T400, which I found in an e-waste bin. It must have been used in docked mode for its entire service life, because the entire laptop is in factory condition.
This later example is from a Thinkpad T430s. While the Trackpoint itself is identical, the keyboard has been redesigned to a version with less key travel. While I'm fine with both keyboards, I (and quite a few Thinkpad enthusiasts) prefer the older keyboard for its feel and layout. On the standard T430, it's actually possible to swap out its keyboard with a T420 keyboard, which uses the older design.
Intel Extreme Edition
This physically huge CPU is part of Intel's Extreme Edition lineup, featuring some of their best CPUs available to consumers, usually at a pretty hefty price premium. Due to its age, this 5930K is actually a bit slower than the overclocked Ryzen 7 PRO 1700 in my desktop. However, where it shines is with its whopping forty PCIe lanes, which makes it a very good hardware tester (I can check if multiple expansion cards are working at once). For context, most graphics cards use 8 or 16 PCIe lanes.
These are a handful of high-end DDR4-3200 ECC RDIMMs, used in workstations and servers. Registered memory differs from unbuffered memory (UDIMMs) in that it has a register chip (which is the largest chip under the right edge of the sticker in the photo). The CPU's memory controller communicates with the register instead of with the memory chips themselves, allowing more memory chips to be connected (at the expense of slightly higher latency due to the "middleman" register chip).
However, it's important to note that ECC UDIMMs and RDIMMs can't be used on every system. The CPU and motherboard need to be designed for RDIMMs (and have the feature enabled). AMD EPYC and Intel Xeon Platinum CPUs are a common pairing for this type of memory.
Also, high-capacity RDIMMs can get HOT. Usually, consumer-grade UDIMMs get just a little warm due to their lower capacities and lack of extra circuitry required for ECC and register functionality. In contrast, these bad boys can and will absolutely overheat in the absence of proper airflow (they were meant to be used in a high-airflow workstation/server environment, after all).
This is a rarer example of the heatsinks I put into computers I work on. In this case, the heatsink fins are skived, meaning a blade repeatedly cuts into a solid copper block at a slight angle and peels each slice upward to form a fin. Since the heatsink base and fins are a single piece of metal, heat transfer is more efficient. These copper heatsinks are considerably more expensive than their aluminum counterparts, so we only use them on particularly hot-running components where every bit of cooling potential counts.
CPU backside and socket
This is the underside of one of Intel's LGA1155 CPUs. The gold contacts make contact with the corresponding pins in the socket in the image below, and the components in the center are mainly capacitors, which help reduce fluctuations in voltage (ripple) and provide "cleaner" power to the CPU die itself, enabling better stability at higher frequencies.
This is an engineering sample of an Intel Xeon E5-2699 V4, a 22-core CPU released in 2016 for high-end servers. This is one of the fastest CPUs released for its socket (LGA2011-3), the only faster chip being the 2699A V4 (it has a slightly higher base clock).
With 22 cores and 44 threads, this CPU is an excellent fit for the compute and virtualization duties I plan to use it for. In my homelab, it's going to have access to 128GB of registered ECC DDR4 as a great way for me to test out new software configurations for The IT Club. (I could test out up to eleven separate dual-core OS installs at once!)
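That eleven-install figure is just division, but here it is as a quick capacity check (splitting the RAM evenly is my own assumption, not a fixed plan):

```python
# The "eleven dual-core installs" math: give each test VM two physical
# cores and split the 128 GB of registered ECC DDR4 evenly between them.

CORES, RAM_GB = 22, 128
VCPUS_PER_VM = 2

vms = CORES // VCPUS_PER_VM
print(vms, RAM_GB // vms)  # 11 VMs with ~11 GB of RAM each
```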
Also, make sure you know the risks when getting an engineering sample (ES) or qualification sample (QS) CPU. You'll clearly know when you've got an ES or QS CPU by the words "Intel Confidential", whereas production models would have the actual model name and number. While QS CPUs are oftentimes functionally identical to their production counterparts, some features in ES CPUs may...not be quite finished yet in earlier core steppings (aka revisions). These variations can range from an unlocked multiplier to unconfirmed firmware support to feature sets that are better or worse than the final version.
This is a close-up of a typical Intel LGA115X stock cooler. The aluminum part is made by extrusion--forcing a chunk of aluminum through a heatsink-shaped die and slicing the result like a cucumber, with each slice forming one heatsink. Extrusion is a great way to make a lot of heatsinks quickly and cheaply, though other approaches are often more effective. This particular example has a copper slug in the middle which helps to transfer heat away from the CPU more quickly, but Intel omitted this slug in favor of a full-aluminum design in later revisions to save costs.
This is a Pentium MMX-166, a CPU from the mid-90s. It's far too old to run any modern application, but it's still a nice blast from the past, and a cool little keepsake. I dug this particular example out of an e-waste bin. Shame that they're being thrown away; they're such beautiful chips.
By the way, even if you think your computer is too old to donate, rest assured it's not: even extremely old computers are useful for parts that help repair and upgrade newer devices. Anything that's left over is properly restored and preserved.
The IT Club buys many types of these tiny heatsinks en masse to stick onto tiny but hot-running components like MOSFETs and chipsets. The sides of this particular heatsink are less than a centimeter wide, but even such a tiny heatsink can play an instrumental role in preventing throttling and enhancing device lifespan.
I guess this is basically Chia farming.
In all seriousness, this was a dying 1TB Seagate HDD that I took apart and photoshopped a dime onto. The orange plastic bit that the heads touch is an unload ramp, a feature of all modern laptop hard drives and most desktop hard drives which increases shock resistance (older drives let the heads rest on a "landing zone" on the platters). I say "most" desktop hard drives because current-generation 500GB-1TB 3.5" Seagate Barracudas (with the green-accented labels) still use the older landing-zone technology.
Motherboard backside cooling
This is the back side of the motherboard of my desktop. The VRMs on this motherboard, an Asrock AB350M, run quite hot, so I resorted to more extreme measures to ensure that they remained cool. Since quite a bit of heat is dissipated through the PCB, I've mounted heatsinks, a fan, and ducting on the back of the VRM area to maximize heat transfer. Air is sucked in through a centimeter-wide gap between the intake and the side panel, flows through the rear heatsinks, wraps around to the front of the motherboard to cool the VRM heatsinks there, and exits through the CPU cooler, a top-mounted MasterLiquid 240.
I first built this thing in 2018, and as I grow, the desktop grows with me through frequent upgrades. In terms of cooling modifications, this is actually Revision 4. Here's a (probably incomplete) dossier of other upgrades:
CPU: Ryzen 5 2600 (6x 3.5 GHz) → Ryzen 7 PRO 1700 (8x 3.8 GHz)
GPU: EVGA 1060 SC 3 GB → Gigabyte Aorus 1660 Ti → Sapphire Nitro+ 5700 XT that I used in a research project → Nvidia Titan Xp
HDD: WD Se 3TB → WD Se 5TB → HGST Ultrastar He10 10TB
SSD: Unreleased 128GB WD SiliconDrive prototype → 256GB Silicon Power P34A80
This is a small sample of the CPUs (and a few other pieces of silicon) in either my or The IT Club's possession. They come in all ages, too. Some chips here, like the Pentium M and PowerPC 750CE at the top-left and top-right, are far too old for any modern task and are simply part of my vintage CPU collection, while others, like the i7-5820K near the lower-left corner and Ryzen 5 2600 near the upper-right corner, are waiting for boards to go into so they can go help underserved students.
This is the VRM and chipset heatsink of a Gigabyte X99-UD3P motherboard. To cut costs, Gigabyte actually made the heatsink in multiple parts, but forgot to add any thermal interface material between the parts to fill in air gaps and transfer heat efficiently. Fortunately, a bit of Kryonaut in between the two surfaces took care of things.
Despite their appearance, these CPUs are perfectly fine! This is a procedure known as delidding, where the metal integrated heat spreader (IHS) of the CPU is removed to replace the thermally-conductive material between the silicon die and the IHS with something more thermally conductive. Since many (but not all) Intel CPUs use a very cheap thermal paste, this procedure can lower temperatures by up to 20 degrees, potentially quadrupling the lifetime of the component. Delidding is typically an enthusiast procedure only done on ultra-high-end desktops, but we delid the CPU of every single applicable desktop that we refurbish. (Soldered CPUs do not need to be delidded.) While this may seem unnecessary, we deeply care about the environment and the underserved students we help, and delidding helps to maximize performance and service life, ensuring that these devices will be useful for a long time.
Some of AMD's chipsets, like the A320, B350, and B450, have a metal heatspreader on them, which I of course removed to enable more direct thermal transfer between the chipset die and heatsink. Strangely, removing the IHS doesn't expose the die, just some hard green thermal interface material which doesn't come off with alcohol or scrubbing. The B350 doesn't even run hot enough to warrant delidding; I was simply curious to see what lay underneath (and also wanted to be the first person to delid a chipset). And yes, it works just fine after the delidding. I'm typing on this particular desktop right now, actually.
"Yeah, I have RGB."
On a serious note, I actually really dislike most implementations of RGB. It's gaudy, keeps you from sleeping next to your computer at night, makes things more expensive while adding no extra functionality, and the software required can often be poorly written, resource-intensive, or just a big piece of spyware. On my personal hardware, I turn off the LEDs, cover them with electrical tape, or just have them glow a soft white.
About these HDDs--after overwriting each one with zeroes, they each tested good with no pre-fail SMART attributes. They'll go into desktops as secondary storage. While SSDs are the kings of speed, HDDs provide tons of storage space at an unbeatable value, and using an SSD and HDD enables a cost-effective method of both high-speed and high-capacity storage. A similar SSD-HDD setup is what I use in my personal computer.
The two images below depict Magic Machine, my purpose-built Folding@Home rig that I built out of a Tesla P100, spare parts, cardboard, packing tape, and tenacity. The Tesla P100 accelerator is what's doing all the hard work; the rest of the system is just there to supply it with power, cooling, and data. Datacenter compute cards like the P100 often don't have an integrated fan, instead relying on the large amount of airflow forced through a server unit by its powerful chassis fans. This machine is located in my bedroom, so both noise and airflow are a concern. I used an NF-A12x25 chromax.black.swap and an NF-F12 industrialPPC-3000 with a custom BIOS fan curve. Die edge temperatures max out at approximately 40 C over ambient, so that's approximately 65 C at room temperature, give or take several degrees. I primarily fold for my other 501(c)(3) nonprofit, Folding@SiliconValley.
Also, yes--those are heatsinks on top of that desk lamp. I put them there since the lamp ran quite hot, and I put lots of similar heatsinks inside most computers I service. If you pop open a computer refurbished by The IT Club and find small heatsinks everywhere, it was probably serviced by me.
By the way, if you want to build your own Magic Machine, enabling "Above 4G Decoding" in the BIOS may be necessary to allow you to use datacenter accelerators like the P100.
Update: Magic Machine's getting an upgrade, and Magic Machine 2 might be in the works! I found an i7-8700 system in the e-waste, as well as an i7-4790K. Both CPUs are quite capable for driving more powerful GPUs, as this Celeron's getting a bit limited.
This is an Arctic P12 PWM PST that I pulled out of my desktop while cleaning it out. The dust accumulated during months of near-continuous operation clearly highlights how air flows through this fan. While it's intriguing to look at sometimes, keeping a system dust-free is key to a long lifespan. Dust clogs heatsinks and fans, which can cause systems to overheat and eventually fail (this is part of the reason Macbooks always seem to break within 3-5 years). Furthermore, removing large amounts of dust can lead to electrostatic discharge, potentially damaging sensitive electronics. To get rid of dust, use some compressed air and a dry toothbrush.
Speaking of particulate matter, it is a terrible idea to smoke or vape anywhere near a computer. I don't know what sort of carcinogenic garbage is in those things, but it forms this sticky, sickly-smelling residue inside computers that's almost impossible to clean off. The horrible smell also permeates plastics, and it's actually impossible to remove.
I just thought the air intake of a Core 2 Duo Dell Inspiron looked cool, and decided to take a photo. This laptop's thermal system is actually massively overbuilt. There's a nice, thick heatpipe cooling the CPU and northbridge, and the large, unrestrictive air vent pictured here makes it completely unnecessary for the fan to turn on until the CPU has been running at 100% for some time.
I took this photo when popping the heatsink off a 1 GB Sapphire HD 5450 to reapply thermal paste. In retrospect, I should have altered the color temperature of the part I didn't grey out. Oh well.
This is a good example of dGPU cards that go into some of the desktops that The IT Club services. They're useful as a display output, which is all that's needed for everyday office work (you only need a large, fast GPU for tasks like 3D-intensive gaming, scientific computing, and crypto mining). Still, I overclock each and every one of them for increased performance, and also because why not--the fresh layer of thermal paste means more thermal headroom, and tiny GPUs like these overclock like mad (fewer transistors and less heat to worry about). This particular card is rock solid with its core and memory frequency cranked as far as the sliders will allow.
Update on this particular card: Like so many others, I have added small VRM heatsinks to it.
This is a close-up of part of a DIMM (dual in-line memory module), the standard form factor for desktop RAM (also referred to as memory). DIMMs (and their SODIMM counterpart on laptops) provide a toolless, trivially-easy way to upgrade RAM in computers, prolonging their lifespan and reducing e-waste. Some computer designs, such as most Macbooks, have the RAM directly soldered to the motherboard, making it impossible to upgrade. When purchasing a computer, it's an excellent idea to check if the memory is upgradeable, as it can definitely save money in the long run. A simple RAM upgrade can often be enough to significantly improve the responsiveness of a device, and it's part of The IT Club's standard refurbishment procedure.
Dead hard drives
I found some dead WD Greens in an e-waste bin, so I tore them down and took some photos. Those two yellow squares near the tip of the arm are piezoelectric microactuators, which move the heads more precisely than the big voice coil on the other end, enabling higher recording densities and better vibration tolerance.
Crazy to think how the SD card on the right holds 1,024 times as much data as the Memory Stick on the left. Flash memory, like almost every other computer-related part, has improved drastically over time.
Function over form
This is the heatsink of an Asrock X99 Taichi motherboard that a very generous New Yorker gifted to me, along with other parts. The motherboard looks like this from the factory, but since the glued-on metal plate is only there for decoration, I removed it because function is far more important than form (the metal plate restricts cooling). Unfortunately, even before the original owner sent over the board, it didn't work. There was a metal shaving dangerously close to the power delivery section, which I suspect caused a short and damaged the board. Hopefully I can repair it.
The drive on the left is a 4.3GB Quantum Bigfoot CY. Its stupendous 5.25" size meant that, for its time, it could store a lot more data than its regular 3.5" desktop counterparts. However, its anemic 3600 RPM spindle speed made systems feel particularly slow (drives of the era spun at 4500 or 5400 RPM), so it primarily went into cheap, mass-market systems. I found this one in the same system as the Pentium pictured earlier. Surprisingly, it still functions fine with no bad sectors. While it's definitely not going in an IT Club system due to its age, interface, and capacity, it's a nice, unique piece of tech that's worth keeping around. (And also it sounds nice.)
The one dead HGST drive I've ever encountered
Of all the HGST drives I have encountered, this was the only one that showed any pre-failure SMART attributes, and even then, it was a single reallocated sector. However, since HDDs with any pre-fail SMART attributes are 39 times more likely to fail within 60 days than drives that show up as healthy, we don't take any chances and destroy them. We only use drives that show up as "healthy" in SMART as secondary storage in The IT Club's computers (all computers get SSDs as primary storage to store the OS and programs).
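The triage rule above is easy to automate. Here's a sketch of how I'd flag a drive from the attribute table that smartmontools' `smartctl -A /dev/sdX` prints (the column layout is the usual smartctl table; the sample data below is made up for illustration):

```python
# Sketch of the club's drive-triage rule: flag any SMART attribute whose
# TYPE column reads "Pre-fail" and whose raw value is nonzero.

def prefail_attributes(smartctl_output: str) -> list[tuple[str, int]]:
    hits = []
    for line in smartctl_output.splitlines():
        cols = line.split()
        # Data rows start with a numeric attribute ID; TYPE is the 7th column.
        if len(cols) >= 10 and cols[0].isdigit() and cols[6] == "Pre-fail":
            raw = int(cols[9])
            if raw > 0:
                hits.append((cols[1], raw))
    return hits

sample = """\
  5 Reallocated_Sector_Ct   0x0033   100   100   005    Pre-fail  Always       -       1
  9 Power_On_Hours          0x0012   095   095   000    Old_age   Always       -       21340
"""
print(prefail_attributes(sample))  # [('Reallocated_Sector_Ct', 1)]
```

Any hit from a check like this, even a single reallocated sector, and the drive gets destroyed rather than redeployed.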
Even then, I encountered this drive nearly four years after establishing The IT Club, and I'm impressed that it took me this long to find a faulty HGST drive. HGST (Hitachi Global Storage Technologies) was bought out by WD in 2012, and the brand was retired in 2018. They were known for making some of the most reliable drives out there, and it's no surprise that WD still uses HGST engineers and designs--much of their current WD Red/Gold lineup looks very similar to Deskstars and Ultrastars of ten years ago.
These are the contents of a package that a very kind friend in New York gave me. The black NF-F12 IndustrialPPC-3000 fan in the top-right quadrant is one of the fans cooling Magic Machine's P100 accelerator card, and the white Asrock board in the bottom-right was the one that I think was damaged by a metal shaving. For now, I plan to use the top-left X99-UD3P board and its i7-5930K for testing hardware. The 5930K's 40 whopping CPU-attached PCIe lanes means it can easily utilize up to four expansion cards simultaneously. In fact, it might be a good candidate for Magic Machine II, since it can handle a second P100. I just need to find a compatible LGA20XX cooler.
Update: I found...many...LGA20XX systems, all with coolers, that my university threw away for some inane reason (they all still work, mind you). Welp, guess that problem is solved...
You wouldn't try and cool a cable...right?
I bought some Noctua NF-A4x10 fans to use in my research a while back, and they came with these Low Noise Adaptors (LNA), which contain a small resistor to decrease the fan's RPM. However, since those NF-A4s are barely audible even at full tilt, I use these for other fans instead. The high current draw of most fans can heat up the resistor considerably, so I added some heatsinking to keep it cool.
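To see why that heatsinking isn't overkill, a series resistor dissipates P = I²R. The 50 Ω value below is purely illustrative (I'm not claiming it's the actual LNA resistance), but the two currents are the fan draws mentioned elsewhere on this page:

```python
# Why an in-line resistor can get warm: it dissipates P = I^2 * R.
# The resistance is an assumed, illustrative value, not Noctua's spec.

def resistor_watts(current_a: float, ohms: float) -> float:
    return current_a ** 2 * ohms

# A gentle 40mm Noctua vs. the 0.75A server screamer described above:
print(round(resistor_watts(0.06, 50), 3))  # 0.18 W -- barely warm
print(round(resistor_watts(0.75, 50), 3))  # ~28 W -- would cook the resistor
```

Because power scales with the square of the current, a fan drawing ten times more current puts a hundred times more heat into the same resistor.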
Even more delidding
This is a Samsung PM981 SSD, which is the OEM equivalent of the 970 EVO, just with a green PCB, no copper sticker, and a different firmware. The box cutter-looking thing is Jimmy, which is bendy and great for prying (it's thin but not sharp, meaning that it can get into small spaces with minimal risk of accidental damage). It's pretty obvious that the controller, which runs at 110 C, would be fit for some cooling modifications. This is the second M.2 SSD I've delidded (the first one was a 970 EVO). Other things I've delidded include GPUs, chipsets, PlayStation 3s, Chromebooks, NICs, and PLX switches, among others.
This is a WD VelociRaptor, their enthusiast series of 10K RPM drives. Most modern consumer hard drives spin at 5200, 5400, 5900, or 7200 RPM. Higher RPMs provide higher sustained transfer rates and better random-access performance (the head doesn't have to wait as long for the requested data to pass under it) at the expense of increased noise and power consumption. Before SSDs, 10K and 15K drives were the performance kings.
However, they ran hot and loud, and the increased turbulent airflow at higher RPMs meant that high-performance hard drives like these took a significant capacity hit, since it's harder to position the heads accurately over the desired region on the platter. In 2022, the highest-capacity 15K RPM drive, the Seagate Exos 15E900, goes up to 900 GB, while the highest-capacity 2.5", air-filled drive, the (SMR-equipped) ST5000LM000, goes up to 5TB at 5400 RPM.
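The random-access benefit of high RPM is easy to quantify: on average the head waits half a revolution for the requested sector, so average rotational latency in milliseconds is 30,000 / RPM. A quick comparison across the spindle speeds mentioned on this page:

```python
# Average rotational latency: half a revolution, so latency_ms = 30000 / RPM.

def avg_rotational_latency_ms(rpm: int) -> float:
    return 30_000 / rpm

for rpm in (3600, 5400, 7200, 10_000, 15_000):
    print(f"{rpm:>6} RPM: {avg_rotational_latency_ms(rpm):.2f} ms")
```

Going from a 5400 RPM consumer drive to a 15K enterprise drive cuts the average wait from about 5.6 ms to 2 ms, which is why these drives were the performance kings before SSDs made the whole question moot.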
Note the "Enterprise Storage" label. VelociRaptors were actually meant to be sold as enterprise drives, and WD simply gave them a SATA interface, a fancy name, and this nice "Icepack" 2.5"-to-3.5" enclosure to sell some of them as consumer drives (though some VelociRaptors were sold as bare 2.5" drives). Their enterprise-focused design is a good thing, since this means that this drive chassis was built with reliability as a priority. And that shows--while I haven't seen more than a handful of them, each VelociRaptor I've come across is in perfect health.
Nvidia's Titan line is a series of enthusiast-grade hardware, either offering the best iterations of consumer-grade hardware or separating fools from their money, but often both. Fortunately, I was able to purchase this fine example for $700 USD, $500 less than its MSRP. I sold my 5700 XT in order to get this card.
For the Geforce 10 series of Pascal-architecture cards, Nvidia actually had two Titans. The first, the Titan X Pascal, was essentially a 1080 Ti released a few months earlier at $1,200. It was solely made to separate enthusiasts from their money--the 1080 Ti was almost identical on paper, and the improved third-party coolers from companies like MSI, Gigabyte, and EVGA made many 1080 Tis noticeably faster and cooler than the Titan X Pascal, all while selling for around $500 less.
So, what's up with the Titan Xp (not to be mistaken with the Titan X Pascal)? This card is the true Titan of its generation, packing a fully-enabled GP102 graphics processor, and all the extra goodies like better power delivery, improved efficiency, and faster memory.
I made a very deliberate choice to purchase this over the similar 1080 Ti--while the 1080 Ti has 11 GB of memory (VRAM), the Titan Xp has 12 GB, which helps for things like machine learning and Adobe Creative Cloud applications like Lightroom that leak VRAM. Furthermore, the 7% increase in enabled shaders (the thousands of tiny little processors in a GPU) grants a significant performance advantage in tasks like Folding@Home.
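That 7% figure comes straight from the shader counts, and checks out: the fully-enabled GP102 in the Titan Xp has 3840 CUDA cores versus 3584 on the 1080 Ti.

```python
# The "7% more shaders" claim, checked against the published CUDA core counts.

titan_xp, gtx_1080_ti = 3840, 3584
print(round((titan_xp / gtx_1080_ti - 1) * 100, 1))  # 7.1 (percent)
```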
I actually recommend against purchasing a Titan for most users--Nvidia ships most Titans with an inadequate stock cooler like this one, so they run very hot without modification, and they carry very little tangible benefit for gaming, which is what most people use their GPUs for. Furthermore, older, more affordably-priced Titans, like the Titan Xp, don't have ray-tracing hardware, though this doesn't affect me. I plan to watercool mine, which will significantly reduce temperatures.
I regularly post photos on this page! Please feel free to revisit this page every few weeks or so, and you'll probably see something new each time.