Slipstreaming Software into Windows ISO?

So has anyone here ever slipstreamed software into a Windows ISO?
I am looking to slipstream Adobe Flash, Java, Adobe Reader, Avira Free, OpenOffice, and possibly other things in the future.
I know some people have said it's easy if you follow directions, and some have said it's a pain in the ass.

I can't for the life of me remember the name of the apps that I used for Vista and XP, and yeah, it was a bit of a pain back then.
for win 7:
for win 8:
for win 10: not sure. I bet it self-destructs if you try to make your own customised image, unless a similar method to the Windows 8 instructions at the bottom also works for 10?

Official Windows Updates and supported Windows components (like the .Net Framework) can be integrated directly using the methods that @TheRabidTech described. The Windows 10 instructions are the same as the Windows 8-8.1 ones but are even easier to do.

Bundling third party applications is an entirely different ballgame, however. There are a couple of ways to do what you're thinking, but in summary, the normal OEM way is to:

  1. Obtain the Windows .ISO
  2. Install the image onto (preferably virtual) hardware
  3. Install the software
  4. (Recommended but optional) generalize the system using sysprep
  5. Boot the system using WinPE (USB/CD/PXE/flat)
  6. Capture custom.wim to a network share or USB ("dism /capture-image" or "imagex /capture")
  7. Merge custom.wim into the WinPE.iso contents
  8. Deploy to the target system ("dism /apply-image" or "imagex /apply")

For OEM or enterprise deployment, step 7 is not necessary since WIM files can be deployed over the network or from USB sticks directly. I posted some software to automate steps 5 to 8 if you're interested; I called it ADKTools.
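For reference, the capture and deploy steps (6 and 8) with plain dism look roughly like this from a WinPE prompt. Treat it as a sketch: W: as the Windows partition and Z: as a mapped network share are assumptions, and the image name is a placeholder.

```shell
rem Step 6: capture the prepared system partition to a WIM on a network share.
dism /Capture-Image /ImageFile:Z:\images\custom.wim /CaptureDir:W:\ /Name:"Custom Image" /Compress:max

rem Step 8: apply that image to a blank, formatted partition on the target,
rem then write the boot files so the machine can actually boot it.
dism /Apply-Image /ImageFile:Z:\images\custom.wim /Index:1 /ApplyDir:W:\
bcdboot W:\Windows
```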

I would like to explain why both reactions make sense.

In very general terms (not specific to .ISO files), what you're referring to is the idea of having applications automatically available in an OS environment, or "application abstraction." Depending on exactly how you want things to work, the exact versions of the software involved, and the exact technology used to "capture" the applications, it can be either really easy or really difficult.

Application abstraction focuses on either the bundling approach (as described above) or the idea of capturing an application by measuring the differences between the OS by itself and the OS with the application installed. This second "difference" approach is VMware's ThinApp technology, and Citrix has an alternative implementation. A key limitation is that this technology only works for 32-bit applications (so no Adobe Premiere) as far as I know; I haven't looked into the Citrix implementation, however. This second approach tends to be a PIA to set up (but doable) and is easy to maintain.

"scanstate /apps /capture"
This is a free alternative to VMware ThinApp, part of the Windows ADK, without the 32-bit-only limitation. It creates a "software.ppkg" file that has all of the applications pre-installed.

Pros: True native third party application abstraction. Free, part of Windows. Well supported.
Cons: 2 key limitations: 1) Windows 10 only, and 2) a 10GB maximum compressed capture limit (so still no Adobe CS depending).
Implementation: Simply run the command to generate a software.ppkg capture, then place the .ppkg file in the appropriate directory (C:\Recovery\Customizations, I think) and all the apps will be automatically installed when Windows is installed or reset/refreshed.
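If memory serves, the full ScanState command looks something like this. The paths and package name are just examples, not gospel; check the ADK docs for the exact switches:

```shell
rem Capture all installed desktop applications into a provisioning package.
rem /o overwrites an existing package, /c continues past non-fatal errors,
rem /v:13 with /l: writes a verbose log file for troubleshooting.
ScanState.exe /apps /ppkg:C:\Recovery\Customizations\apps.ppkg /o /c /v:13 /l:C:\scanstate.log
```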

Now for the "bundled software" approach that you're interested in:

How easy do you want the initial capture to be, and how scalable do you want it to be? Do you want the image building process (in order to change the software packages) to be manual or automatic? Should the image work on different hardware, or will the system configuration be exactly the same every time?

If you don't need support for arbitrary different hardware (unlikely), you can use the .VHD format used by the "Create a system image" option in the Windows Backup and Restore center. The .vhd is a byte-for-byte capture technique and captures partitioning information. It works great for "restore my OS" type scenarios that OEMs have to support.

Pros: easy to create
Cons: hardware specific, changing disk sizes is asking for problems, very "manual" approach. All software must be reinstalled for each distinct system.
Implementation: just shrink the existing OS partition with the apps preinstalled and image the system. VHD images can be deployed/restored using WindowsRE.
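The shrink step can at least be scripted with diskpart. A rough sketch follows; the volume letter and shrink amount are placeholders, and diskpart will happily eat data if pointed at the wrong volume, so double-check before running:

```shell
rem Contents of shrink.txt (a diskpart script):
rem   select volume C
rem   shrink desired=20480
rem The shrink amount is in MB (20480 = 20GB). Then run the script with:
diskpart /s shrink.txt
```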

If you want a hardware agnostic image, use WIM technology as I described above. Windows IMaging (WIM) formatted images capture file-system-level information, and captures can be made hardware agnostic by capturing after a sysprep operation. Sysprep can fail and trash the image, so use a VM with snapshot/restore capabilities. Electronic Software Delivery (ESD) images are highly compressed .WIM images. This is the preferred way to deploy Windows, and Microsoft uses this format inside its .ISO files.

Pros: Hardware agnostic third party abstraction, potentially highly scalable, extremely well supported
Cons: Medium complexity overall, Image capture process must be done from WinPE (basically), requires the use of VMs (basically)
Implementation: As described above.
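For the sysprep-before-capture step, the command is roughly this (run inside the reference VM; snapshot first, since as noted sysprep can trash the install):

```shell
rem Generalize the reference install so the captured image is hardware agnostic.
rem /oobe makes the deployed copy boot into first-run setup; /shutdown powers
rem the VM off so you can boot straight into WinPE for the capture.
C:\Windows\System32\Sysprep\sysprep.exe /generalize /oobe /shutdown
```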

The OEM and mid-to-large enterprise way of doing things is to do both "difference" style application abstraction with all the available technologies and automatic .WIM image generation using the MDT (with lots of scripting) and virtualization software. This is typically extremely complex to implement, requires full-time paid IT staff, and might be overkill for what you need.


If you don't mind me asking, what is the end goal of this? @Peanut253's explanation is EXTREMELY thorough, and I'd recommend you read it. However, if you're planning on system deployments or things of that nature, you may need full guides on how to set this up. I'd be more than happy to show you setups for deployment solutions, or give you a variety of choices, if you want a higher level of management than using an ISO to image everything.

Great explanation, seriously! I love that you actually explain all of the options that meet his current criteria, in generalized form, in a well-formatted manner. Just wanted to start by saying that; I appreciate a post that feels like a lot of effort and a certain degree of passion were put into it.

I would argue that MDT doesn't absolutely require a team of people and could be managed by 1-2 people. For example, I currently manage MDT for my company's office as well as for a whole health network here. Granted, the way we implement MDT, we only slipstream the patches that we want into the WIM and then capture it, deploying applications at the time of imaging.

I'm interested in how this guy plans on using it, as maybe we can help him further.

I've been doing this in my "spare" time for a year or so; it's all in my head atm. Hopefully it can be of use to someone else.

Completely agree. The point I was trying to make there was that if you were going to do 100% automated application abstraction with imaging technology then the MDT would be the way to do it.

The MDT is very flexible, supporting deployments of nearly any complexity. Even SCCM really just uses MDT techniques under the hood. Although if you have the budget for an IT team, then hopefully you have the budget for SCCM as well to help them out.


Thanks for everyone's input on this. I just got glasses today and I am getting used to them. So I will have to re-read everything after this god awful headache goes away.

My reason for wanting to do this is simple. I work at a PC Repair shop and what we install are the things listed.
I was hoping to just take a base Windows 7/8/8.1/10 ISO and create an ISO that slipstreams the programs into the install procedure, so I spend less time installing those programs and get my customers their PCs back faster.

We don't have any business clients right now that need large deployments, so this is basically just to speed up reinstalls of PCs. Obviously I couldn't slipstream specific drivers, since this will be used on many different PCs.

Once again, thanks for everyone's input. Keep it coming so we can help each other out.

I would recommend VM based configuration + wim capture then. That's pretty much what I do now actually. Medium overhead/complexity. Testing is annoying, but at least there's easy rollback, theoretical (future) MDT integration and images are hardware independent.

Note that even when abstracting out drivers, it's still possible to "dism /add-driver" a driver pack (if one is available for that system) after imaging it, to save time. This is what enterprises do. The various ways of obtaining and managing drivers could be its own 2000 word post, so I'll just skip that part...
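As a sketch, injecting a driver pack offline into a freshly applied image looks like this; W:\ as the applied Windows partition and the driver pack folder are placeholders:

```shell
rem Recursively add every .inf found under the driver pack folder
rem to the offline image on W:\ (no need to boot it first).
dism /Image:W:\ /Add-Driver /Driver:Z:\drivers\SomeLaptopModel /Recurse
```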

The change I would make to your workflow is to boot WinPE from a PXE server (WDS on a Server 2012 VM would be perfect for this) or USB sticks. WinPE is part of MS's ADK. Then just image the system over the network with a custom.wim image using "dism /apply-image" or "imagex /apply" instead of using setup.exe. This makes more sense than trying to work with .ISO files or the unwieldy setup.exe unattend.xml scripts (eww).

A slightly more automated alternative is to attempt to use the MDT but that might be overkill.

Also note that images are version/architecture/edition specific. So, if not using hacks, Win 7 Pro and Win 7 Home Premium would need different VMs and would need to be captured separately. The same is true for each different version (7/8.1/10) and, of course, each architecture (x86/x64).

VMware Workstation:

ADK Documentation: (required reading)
ADKTools: (self plug; automates ADK/WinPE building/unicast+usb style deployments)

Regarding the MDT:
So, the ADK and ADKTools are really just a set of tools and scripts. Components, really, not a unified workflow; they leave the workflow to be whatever you want it to be. If you prefer more "solution" or "software" oriented approaches, then use the Microsoft Deployment Toolkit (MDT). I prefer using only the tools themselves with very little abstraction (scripts), since then you can fix anything that goes wrong very easily for smaller deployments (like a PC shop's). But if you want it, here is the overkill method:

MDT Documentation:
The latest version is 2013 Update 2, available for download from MS if you google "MDT 2013 Update 2."

Install all the apps you want after OS install with one installer you periodically update?
But slipstreaming a few years worth of updates into an ISO saves a lot of windows update time.
I've mostly been doing the updates from update DVDs for each OS after installing, since cutting down excessive bandwidth usage was my main concern. The machine has to reboot at least 3 or 4 times between batches of updates and takes a few hours to get through them all, so you save a few hours with each reload by slipstreaming the latest updates into a newer ISO every couple of months. It'll make your boss really happy when he sees the productivity increase and more rapid flow of work.

@Peanut253 I have always wanted to dabble in PXE based solutions, but we are a very, very small business. We are a two-man operation, lol! We do computer repair, phone repair, tablet repair, and TV repair. Amazingly, we accomplish all of this with just the two of us. I do all the computers, while we both do everything else.

You've given a lot of great information and I am sure I and many other people will be able to accomplish a lot with what you have. But let's see if I can refine my question a little bit.

I want to have a Windows 7, Windows 8.1, and Windows 10 ISO that has the above-mentioned programs installed during the normal installation process. I realize I will need one for each version of Windows, and I have plenty of 8GB USB sticks for this. I was just looking for a way to take an existing ISO, add all of those programs (and now all the latest updates) to it, write it to a USB, and install it on customers' computers.

@TheRabidTech I've used Ninite before and I did like it a lot, but it totally slipped my mind that I could use it to automate most of this stuff. Thanks for adding that and reminding me of it.

PXE is actually really easy to set up in server 2012 but anyway >.>

Yes, TheRabidTech's idea might be the most workable if the custom.wim VM + capture approach is too daunting. Just install the apps after the OS.

After integrating Windows Updates and software packages as TheRabidTech described in the first response, just use automation scripts after the install to install the actual software. Try to get MSI packages for software since those can usually be installed automatically. Ninite should take care of most of the little ones.

For further automation, AutoIT scripts can take care of installing most of the rest; AutoIT takes a day or two to learn. Then it's very easy to have a folder of software, click a batch file, and have it install all the MSI packages as well as trigger Ninite and (optionally) the AutoIT automation scripts for installers that do not natively support automation.
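A minimal sketch of that batch file might look like this; the software\ folder layout and the Ninite installer filename are assumptions:

```shell
@echo off
rem Silently install every MSI found in the software folder.
for %%f in (software\*.msi) do (
    echo Installing %%f ...
    msiexec /i "%%f" /qn /norestart
)
rem Ninite handles the common freeware; AutoIT scripts for the stubborn
rem installers would be launched after this point.
start /wait software\NiniteInstaller.exe
```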

Lol, yeah, I'm sure it is. Main thing here is, we're so small we can barely spend $100 on something we would actually need.
Where our business is located, if I told someone it's $120 to replace the LCD in their laptop, they would complain that it's too expensive and ask if I could do it for $90.

But thanks for all of your input @Peanut253 . You seem to have a very good head on your shoulders!
Same for you @TheRabidTech.

There are also several tools you could use to customize a Windows 7 ISO,
like vLite, RT7 Lite, etc.


AHHHHH HAAA! Thank you @MisteryAngel. vLite and nLite were those things I couldn't remember, and now there is NTLite?
I think nLite only does XP?
And NTLite does 7 through to 10.
I think vLite was for Vista? But there is apparently a version for Windows 7 too.
I'm going to have a look at NTLite and see how cool it is.

It's been a while, but I remember a couple of other methods of adding to the Windows install. I apologize if the information is not nearly as complete as what's been put on display previously, or if my info has already been covered... There was a method where adding files to the $OEM$ folder would copy files from the install source to a designated directory. A user script would then launch after Windows installed, and thus we were able to automate some of our software installs along with the Windows install. Another method, one geared more towards drivers, was to edit Microsoft's answer file. There were two files in the setup that Windows uses as an answer file; forgive me, I forget the names. The instructions I followed were in the Windows 2000 Server deployment manual. Basically you just add the Filename, Source, and Destination to the list. Of course, the filenames were not complete inside the cab file, but after some practice I was able to figure it out.
The hardest part was having Windows recognize the changes. If anyone remembers this method and had failed in the past, here's what I did. For some reason Windows didn't like the default (I think Courier) font: you cannot just open the file and start typing. What I ended up doing was to copy-paste a line of existing text, then place the cursor inside the pasted text and make the changes, finishing by deleting and backspacing to get rid of the unwanted letters... Windows was able to "read" my changes after that. Weird, but true; I can't explain why, only that this is how I was able to do it. But I noticed recently that Windows 10 ignored my $OEM$ folder and did not copy anything onto the HDD, which explains why I am here learning to do it again. This may not help anyone because of changes made to Win 10, but hopefully it explains past failures to someone.

2 Year necro, closed. I assume the OP has it covered. @ed_ritchey if you have a question, create a new thread and link back to this one, we would be glad to help out.