Starting with Terraform, Windows and Azure Part 3

Reading Time: 5 minutes

In the last two posts we got our local toolset going. Now it is time to make sure you can actually connect to your Azure tenant with Terraform. This assumes you already have access to an Azure tenant. If you don't, it is really easy to get started with your own Azure tenant, and Microsoft will even throw some credits your way. You can find more information here.

My preferred method of setting up Terraform involves PowerShell, the AzureRM module and a script. The objective is to create a Service Principal and obtain the Client ID and Client Secret. These can be incorporated into your Terraform scripts or put into environment variables to keep your Terraform templates and modules a bit safer.


Now that we got that out of the way, I've devised a script to set up the service principal and obtain the credentials.

I tend to write scripts composed mostly of functions; some I borrowed, some are my own. Feel free to use the material provided in any way you see fit. 😉

First some variables:

# Variables that can be defined, or left blank
[String]$azure_client_name=""     	# Application name
[String]$azure_client_secret=""   	# Application password (converted to a SecureString later in the script)
[String]$azure_group_name=""
[String]$azure_storage_name=""
[String]$azure_subscription_id="" 	# Derived from the account after login
[String]$azure_tenant_id=""       	# Derived from the account after login
[String]$location="West Europe"
[String]$azure_object_id=""

I thought it was a good idea to check the requirements first, just to be certain. The following function checks if you have Azure PowerShell installed. If not, you can get it here.

function Requirements() {
	$found=0
	$azureversion = (Get-Module -ListAvailable -Name Azure -Refresh)
	If ($azureversion.Version.Major -gt 0) {
		$found=$found + 1
		Write-Output "Found Azure PowerShell version: $($azureversion.Version.Major).$($azureversion.Version.Minor)"
	}
	Else {
		Write-Output "Azure PowerShell is missing. Please download and install Azure PowerShell from"
		Write-Output "http://aka.ms/webpi-azps"		
	}
	return $found
}

Then we need to get the subscription:

function AskSubscription() {
	$azuresubscription = Add-AzureRmAccount
	$script:azure_subscription_id = $azuresubscription.Context.Subscription.SubscriptionId
	$script:azure_tenant_id = $azuresubscription.Context.Subscription.TenantId		
}

Some code to generate a random complex password. Note: you'll get the option to enter your own password if you prefer.

Function RandomComplexPassword () {
	param ( [int]$Length = 8 )
 	#Usage: RandomComplexPassword 12
 	$Assembly = Add-Type -AssemblyName System.Web
 	$RandomComplexPassword = [System.Web.Security.Membership]::GeneratePassword($Length,2)
 	return $RandomComplexPassword
}

We also need to provide a name for our service principal.

function AskName() {
	Write-Output ""
	Write-Output "Choose a name for your client."
	Write-Output "This is mandatory - do not leave blank."
	Write-Output "ALPHANUMERIC ONLY. Ex: mytfdeployment."
	Write-Host -NoNewline "> "
	$script:meta_name = Read-Host
}

Now the time comes to use the random complex password and convert it to a secure string for creating the service principal.

function AskSecret() {
	Write-Output ""
	Write-Output "Enter a secret for your application. We recommend generating one with"
	Write-Output "openssl rand -base64 24. If you leave this blank we will attempt to"
	Write-Output "generate one for you using .Net Security Framework. THIS WILL BE SHOWN IN PLAINTEXT AND MIGHT NOT WORK, so please make your own."
	Write-Output "Ex: myterraformsecret8734"
	Write-Host -NoNewline "> "
	$script:azure_client_secret = Read-Host
	if ($script:azure_client_secret -eq "") {
		$script:azure_client_secret = RandomComplexPassword -Length 43
	}	
	Write-Output "Client_secret: $script:azure_client_secret"
	$script:password = ConvertTo-SecureString $script:azure_client_secret -AsPlainText -Force
}

Time to create the Service Principal.

function CreateServicePrinciple() {
	Write-Output "==> Creating service principal"
	$app = New-AzureRmADApplication -DisplayName $meta_name -HomePage "https://$script:meta_name" -IdentifierUris "https://$script:meta_name" -Password $script:password
 	New-AzureRmADServicePrincipal -ApplicationId $app.ApplicationId	
	#sleep 30 seconds to allow resource creation to converge
	Write-Output "Allow for 30 seconds to create the service principal"
	Start-Sleep -s 30
 	New-AzureRmRoleAssignment -RoleDefinitionName Owner -ServicePrincipalName $app.ApplicationId.Guid
	$script:azure_client_id = $app.ApplicationId
	$script:azure_object_id = $app.ObjectId
	if ($error.Count -gt 0)
	{
		Write-Output "Error creating service principal: $azure_client_id"
		exit
	}
}

Let's generate some output, so you can copy it and store it somewhere safe! And I mean SAFE! It's access to your Azure subscription!

function ShowConfigs() {
	Write-Output ""
	Write-Output "Use the following configuration for your Terraform scripts:"
	Write-Output ""
	Write-Output "{"
	Write-Output "      'client_id': $azure_client_id,"
	Write-Output "      'client_secret': $azure_client_secret,"
	Write-Output "      'subscription_id': $azure_subscription_id,"
	Write-Output "      'tenant_id': $azure_tenant_id"
	Write-Output "}"
	Write-Output ""
	Write-Output "Use the following Environmetal variable direct in PowerShell Terminal"
	Write-Output ""
	Write-Output "`$env:ARM_CLIENT_ID=`"$azure_client_id`""
	Write-Output "`$env:ARM_CLIENT_SECRET=`"$azure_client_secret`""
	Write-Output "`$env:ARM_TENANT_ID=`"$azure_tenant_id`""
	Write-Output "`$env:ARM_SUBSCRIPTION_ID=`"$azure_subscription_id`""
	Write-Output ""
}

And now to assemble it all in a script and add the code to actually run all the functions 😀

# Variables that can be defined, or left blank
[String]$azure_client_name=""     	# Application name
[String]$azure_client_secret=""   	# Application password (converted to a SecureString later in the script)
[String]$azure_group_name=""
[String]$azure_storage_name=""
[String]$azure_subscription_id="" 	# Derived from the account after login
[String]$azure_tenant_id=""       	# Derived from the account after login
[String]$location="West Europe"
[String]$azure_object_id=""


function Requirements() {
	$found=0
	$azureversion = (Get-Module -ListAvailable -Name Azure -Refresh)
	If ($azureversion.Version.Major -gt 0) {
		$found=$found + 1
		Write-Output "Found Azure PowerShell version: $($azureversion.Version.Major).$($azureversion.Version.Minor)"
	}
	Else {
		Write-Output "Azure PowerShell is missing. Please download and install Azure PowerShell from"
		Write-Output "http://aka.ms/webpi-azps"		
	}
	return $found
}

function AskSubscription() {
	$azuresubscription = Add-AzureRmAccount
	$script:azure_subscription_id = $azuresubscription.Context.Subscription.SubscriptionId
	$script:azure_tenant_id = $azuresubscription.Context.Subscription.TenantId		
}

Function RandomComplexPassword () {
	param ( [int]$Length = 8 )
 	#Usage: RandomComplexPassword 12
 	$Assembly = Add-Type -AssemblyName System.Web
 	$RandomComplexPassword = [System.Web.Security.Membership]::GeneratePassword($Length,2)
 	return $RandomComplexPassword
}

function AskName() {
	Write-Output ""
	Write-Output "Choose a name for your client."
	Write-Output "This is mandatory - do not leave blank."
	Write-Output "ALPHANUMERIC ONLY. Ex: mytfdeployment."
	Write-Host -NoNewline "> "
	$script:meta_name = Read-Host
}

function AskSecret() {
	Write-Output ""
	Write-Output "Enter a secret for your application. We recommend generating one with"
	Write-Output "openssl rand -base64 24. If you leave this blank we will attempt to"
	Write-Output "generate one for you using .Net Security Framework. THIS WILL BE SHOWN IN PLAINTEXT."
	Write-Output "Ex: myterraformsecret8734"
	Write-Host -NoNewline "> "
	$script:azure_client_secret = Read-Host
	if ($script:azure_client_secret -eq "") {
		$script:azure_client_secret = RandomComplexPassword -Length 43
	}	
	Write-Output "Client_secret: $script:azure_client_secret"
	$script:password = ConvertTo-SecureString $script:azure_client_secret -AsPlainText -Force
}

function CreateServicePrinciple() {
	Write-Output "==> Creating service principal"
	$app = New-AzureRmADApplication -DisplayName $meta_name -HomePage "https://$script:meta_name" -IdentifierUris "https://$script:meta_name" -Password $script:password
 	New-AzureRmADServicePrincipal -ApplicationId $app.ApplicationId	
	#sleep 30 seconds to allow resource creation to converge
	Write-Output "Allow for 30 seconds to create the service principal"
	Start-Sleep -s 30
 	New-AzureRmRoleAssignment -RoleDefinitionName Owner -ServicePrincipalName $app.ApplicationId.Guid
	$script:azure_client_id = $app.ApplicationId
	$script:azure_object_id = $app.ObjectId
	if ($error.Count -gt 0)
	{
		Write-Output "Error creating service principal: $azure_client_id"
		exit
	}
}

function ShowConfigs() {
	Write-Output ""
	Write-Output "Use the following configuration for your Terraform scripts:"
	Write-Output ""
	Write-Output "{"
	Write-Output "      'client_id': $azure_client_id,"
	Write-Output "      'client_secret': $azure_client_secret,"
	Write-Output "      'subscription_id': $azure_subscription_id,"
	Write-Output "      'tenant_id': $azure_tenant_id"
	Write-Output "}"
	Write-Output ""
	Write-Output "Use the following Environmetal variable direct in PowerShell Terminal"
	Write-Output ""
	Write-Output "`$env:ARM_CLIENT_ID=`"$azure_client_id`""
	Write-Output "`$env:ARM_CLIENT_SECRET=`"$azure_client_secret`""
	Write-Output "`$env:ARM_TENANT_ID=`"$azure_tenant_id`""
	Write-Output "`$env:ARM_SUBSCRIPTION_ID=`"$azure_subscription_id`""
	Write-Output ""
}

$reqs = Requirements
if ($reqs -gt 0) {
	AskSubscription
	AskName
	AskSecret
	CreateServicePrinciple
	ShowConfigs
}

 

Just copy and paste the script above, check the content (it's a good habit to check all the code you rip off the web!) and save it as azure-setup.ps1.

You will get output with all the info you need to deploy your first resources to Azure.
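
A quick way to put those values to work is to set them as environment variables in your current PowerShell session and let Terraform pick them up. This is only a sketch: replace the placeholders with the values from the script output, and it assumes your template uses the azurerm provider, which reads the ARM_* variables automatically.

# Replace the placeholders with the values from the script output
$env:ARM_CLIENT_ID       = "<client_id>"
$env:ARM_CLIENT_SECRET   = "<client_secret>"
$env:ARM_TENANT_ID       = "<tenant_id>"
$env:ARM_SUBSCRIPTION_ID = "<subscription_id>"

# With the azurerm provider in your template, Terraform picks these up without credentials in code
terraform init
terraform plan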

TLDR;

There is more than one way to set up Terraform and obtain your credentials for Azure. Methods involving the CLI or the Azure Portal are equally valid, and more information can be found here. The above script fixes everything in about a minute.
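
For reference, the CLI route boils down to something like the following. This is a rough sketch; check the documentation for the exact flags and the format of the output.

# Sign in and create a service principal with the Azure CLI (flags may differ per CLI version)
az login
az ad sp create-for-rbac --name "mytfdeployment" --role "Contributor"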

Starting with Terraform, Windows and Azure Part 2

Reading Time: 3 minutes
Installation

In the previous post I explained how to set up the Windows PATH environment variable so you can run the terraform executable from any path. In this post I will explain how I set up my favorite text editor. In the last year or so I have fallen in love with Microsoft Visual Studio Code. It runs on Windows, macOS and Linux, is very versatile and has all kinds of neat features. At least try it and decide for yourself.

You can download the installation media here.

When you are done installing you will be greeted with a welcome screen that looks something like this:

image

Settings

Now, first let's add some nice settings. To access the settings, you can press the following key combination:

Ctrl+, (Control and Comma)

This will open up the User Settings json file:

image

On the left side are the Default Settings. You can mess with those, but it’s better to put your custom settings in the user settings.

The following is an example of my settings:

{
    "material-icon-theme.showWelcomeMessage": false,
    "workbench.iconTheme": "material-icon-theme",
    // 64-bit PowerShell if available, otherwise 32-bit
    "terminal.integrated.shell.windows": "C:\\Windows\\sysnative\\WindowsPowerShell\\v1.0\\powershell.exe",
    "window.zoomLevel": 0,
    "git.confirmSync": false,
    "workbench.panel.location": "bottom",
    "editor.minimap.enabled": true
}

I can recommend editor.minimap.enabled. This enables a map on the right side of the editor that shows a small version of the file you are editing, which is useful for navigating large text files.

Extensions

Pressing Ctrl+Shift+X will open the Extensions tab on the left side of the window:

image

When you first start out it should be empty. I can recommend the following Extensions:

  • 1337 Theme
  • Advanced Terraform
  • Azure Resource Manager Tools
  • Azure Resource Manager Snippets
  • Material Icon Theme
  • PowerShell
  • Simple Terraform Snippets
  • Terraform
  • Terraform Autocomplete

You can type these names into the 'Search in Extensions Marketplace' field. Visual Studio Code will also recommend extensions based on the files you are working with, but the ones above I find great for editing Terraform files.
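
If you prefer to script your setup, extensions can also be installed from a PowerShell prompt with the code command-line interface. The extension IDs below are examples and may have changed; check the Marketplace page of each extension for its exact identifier.

# Install extensions by ID from the command line (IDs are examples, verify them in the Marketplace)
code --install-extension ms-vscode.PowerShell
code --install-extension mauve.terraform
# List what is currently installed
code --list-extensions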

Console

You can start the console by typing Ctrl+~. On Windows this should default to PowerShell, but it can also be set to Bash, for instance. We'll keep it on PowerShell for now. Visual Studio Code also supports running selected code by pressing F8; the entire file can be executed by pressing F5. Keep in mind that this code execution works when running PowerShell cmdlets. More keyboard shortcuts can be found here.
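
If you want something to try, save the lines below as, say, test.ps1 (the commands are just examples), select the first command and press F8 to run only that selection, or press F5 to run the whole file.

# Select this line and press F8 to run only this line in the integrated terminal
Get-ChildItem -Path $env:USERPROFILE

# Press F5 to run the entire file, including this part
Get-Process | Sort-Object CPU -Descending | Select-Object -First 5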

Explorer

Ctrl+Shift+E opens the Explorer tab. In this tab you can open either files or entire folders. In the case of the latter, the contents of the entire folder will be displayed, which is useful when editing multiple files simultaneously. In fact, Visual Studio Code even adds an 'Open with Code' entry to the context menu (right-click). This works for both folders and files.
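
The same thing works from a PowerShell prompt, provided the installer added the code command to your PATH (an option that is enabled by default):

# Open the current folder (for instance your Terraform project) in Visual Studio Code
cd C:\Terraform
code .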

TLDR;

Download Visual Studio Code, it’s great for Terraform and great for scripting in general.

Starting with Terraform, Windows and Azure Part 1

Reading Time: 2 minutes

This is a series of blog posts going into the setup of Terraform and building your first Azure deployment. First we are going to do the local installation. Terraform is able to run on a variety of operating systems:

  • MacOS
  • FreeBSD
  • Linux
  • OpenBSD
  • Solaris
  • Windows

Since this blog is mostly about Microsoft, Windows and Azure related stuff, I'm going to cover the Windows version. First of all, download the Terraform executable for your Windows installation (32 or 64 bit) right here.

Extract the zip package to a location on your computer, for instance:

c:\Terraform

I would also recommend adding this location to the PATH environment variable in Windows, so you can run Terraform from any location and don't have to type extensive paths every time you are doing a deployment.

To make it easy I’ve devised a PowerShell script:

# Terraform Executable location, modify for a different location
$TerraformPath='c:\Terraform'
# Get the PATH environmental Variable
$Path=(Get-ItemProperty -Path 'Registry::HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Session Manager\Environment' -Name PATH).Path
# Create New PATH environmental Variable
$NewPath=$Path+";"+$TerraformPath
# Set the New PATH environmental Variable
Set-ItemProperty -Path 'Registry::HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Session Manager\Environment' -Name PATH -Value $NewPath
# Verify the Path
(Get-ItemProperty -Path 'Registry::HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Session Manager\Environment' -Name PATH).Path

This should allow you to run Terraform from any path on your machine. You can try this by opening a PowerShell session and running terraform:

image
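
One thing to keep in mind: the script writes to HKLM, so run it from an elevated PowerShell session, and the new PATH value only applies to sessions started after the change. To use Terraform in the session you already have open, you can append the folder to the session PATH as well. A quick sketch:

# Make terraform available in the current session without restarting it
$env:Path += ";C:\Terraform"
terraform version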

So the local Terraform setup is all done. That wasn't too hard. In the next post I will cover the tools I use to write Terraform templates (and a lot of other scripts and things).

Certificate Services and Hardware Security Modules

Reading Time: 3 minutes

A lot of Active Directory Certificate Services deployments are done without a Hardware Security Module (HSM). This does not have to be a problem, depending on the use of the issued certificates. In some deployments, however, it can be a serious security risk not to incorporate an HSM into the design.

What is a Hardware Security Module

An HSM is a computing device which can generate, store and safeguard digital keys, for example the private keys of certificates. Usually these devices come in the form of an appliance or a PCI card, and there are some USB-type HSMs to be found as well. Since these devices almost always fulfill a critical role within a security solution, they are typically certified to recognized standards such as Common Criteria and FIPS 140. Furthermore, a lot of these devices have tampering protection, which can go as far as deleting all information stored in the HSM when tampering is detected.

Why would you require an HSM?

A couple of years back there was a problem with a public certificate provider. There was a breach where apparently the root certificate was stolen, and false certificates were being issued for services that weren't to be trusted but appeared trusted at the time. To say this was a bad thing isn't even scratching the surface.

Using an offline Root Certificate Authority (CA) can be a great help in keeping the certificate chain safe, but remember that on issuing CAs without an HSM, an account with administrator privileges will be able to issue certificates. That account can also export a certificate with the private key, and even make it exportable again, thus creating a certificate that, in theory, can be used anywhere without any control over it.

An HSM stores and guards the private key, so even if someone with an administrator account logs onto an issuing or even a root CA, he or she (yes, she...) needs to unlock the HSM first before they can issue a certificate. You can even set up the HSM to require more than one identity to unlock it. This ensures that no single person can go and create certificates.

For an organization to validate its need for HSMs, the following question is important:

What would be the cost of the worst case scenario when your Public Key Infrastructure was compromised?

If the number from that question is higher than, say, the price of an HSM, wouldn't that make a compelling argument to use one?
If your organization makes use of a PKI (or any cryptography) for any security reason, I would recommend an HSM. Even if it's just to make sure the private key of your only root CA never leaves the datacenter.

Are there alternatives?

There aren't really alternatives. The only alternative you have is an offline network: you will know all the data stays in that network and can't get out. You will also need to lock down things like USB ports to prevent any form of data going in or out. Everything considered, that sounds like a crappy solution to me.

Fortunately, HSMs don't need to be super expensive. You can go from €50 to €50,000 when HSMs are considered. Of course, the more expensive they are, the more powerful their cryptographic capabilities become. For simply hardening the security of your CA in a small organization, a USB solution may suffice.

Legislation

I am not an expert on the subject of law enforcement, but considering the recent case of the FBI versus Apple, there might be some legislation concerning encryption. It is a good idea to check that you are not breaking the law by accident; only on purpose.

Concluding

If you use any form of cryptography for security solutions within your organization, please consider hardening security with a Hardware Security Module. I do not want to exaggerate, but in some cases a compromised security solution such as a Public Key Infrastructure can even compromise the safety of people's lives.
Keep your environment safe, lives may depend on it 😀

Help the Outlooks are down

Reading Time: 2 minutes

So I was contacted by a panicking client: it seemed that none of his Outlook clients could connect to Office 365 anymore. That meant investigating what was wrong. Upon inquiring, he admitted that he had changed some DNS settings the day before, but only the SPF record. That didn't explain connection issues with their Outlook clients. Naturally, the first thing I checked were their public DNS settings. There did not seem to be anything out of the ordinary there, apart from a tiny mistake in the SPF record they created. That did not provide any insight into what was causing this problem.

A couple of days earlier we did renew the Exchange certificate on the on-premises Hybrid server. But since the autodiscover DNS records were pointing at Office 365, this should not be a problem.

So I turned to my trusty connection testing toolset provided by our friends at Microsoft: https://testconnectivity.microsoft.com/
There, on the Office 365 tab, I ran the Outlook connectivity test. The following picture is a screenshot of part of the test outcome. Funny thing, that HTTP 503 error for the Office 365 autodiscover service.

Connectivity Test

A little web research suggested recreating the federation link with Office 365. I felt that would be a bit of an exaggeration. What else could be responsible for an Office 365 service being unavailable? Needless to say, I tested another tenant. That one seemed to have no problem whatsoever. Then it hit me. A quick question to their administrator whether anything had changed on the federation servers or domain controllers confirmed it: yes, updates were installed, zero reboots given. Great, go reboot those machines....
Five minutes later I got a very happy and relieved sysadmin on the phone confirming that everything was working again. He also informed me they were able to log into on-premises servers again. He forgot to mention that fact in an earlier conversation.....*sigh*

Concluding

If you cannot log into your federated Office 365 environment, check your Domain Controllers and Federation servers. Something might be out of order there.

Surface Pro 3 Pen Button Windows 10 Bug

Reading Time: 2 minutes

So I've got this lovely shiny Surface Pro 3. The screen being all touchy, the OS being all Windows 10 Pro, and the pen button starting up the modern OneNote app. However, in Windows 8.1 I had a choice between the modern app version and the desktop version of OneNote. A quick review of the interwebs gave me nothing. So…

Let's investigate…when pressing the blue button, the modern OneNote App pops up…but I want the desktop OneNote App to start.

Let's remove the modern App since I am not going to use it anyway.

There is a nifty PowerShell command just for the purpose of uninstalling things that Windows does not want you to uninstall. (It does not work for Edge.)

Start PowerShell as an Administrator and run the following:

Get-AppxPackage *OneNote*

If it generates output as in the screenshot, it is installed, and piping the command to Remove-AppxPackage should remove the app.
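
That pipeline looks like this; run it from an elevated PowerShell prompt:

# Find the modern OneNote app for the current user and remove it
Get-AppxPackage *OneNote* | Remove-AppxPackage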

Now, when pressing the button, a Windows Store dialog informs me of the fact that nothing can handle 'onenote-cmd' and that I should consider installing a program that can. But wait… everything is in the registry. A quick search in the Windows registry for 'onenote-cmd' finds it here:

HKEY_CLASSES_ROOT\onenote-cmd\

It was empty. So I added the following keys: Shell\Open\Command. As the (Default) value I set the following: "C:\PROGRA~1\MICROS~1\Office15\ONENOTE.EXE"
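
If you prefer to script that registry change, a minimal PowerShell sketch looks like this. The ONENOTE.EXE path is the one from above; adjust it to match your own Office version and installation path.

# Sketch: recreate the onenote-cmd handler and point it at desktop OneNote
$key = 'Registry::HKEY_CLASSES_ROOT\onenote-cmd\Shell\Open\Command'
New-Item -Path $key -Force | Out-Null
Set-ItemProperty -Path $key -Name '(Default)' -Value '"C:\PROGRA~1\MICROS~1\Office15\ONENOTE.EXE"'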

And presto, pressing the blue button opens OneNote for Desktop. You can also start other executables this way.

Back from LyncConf 2014

Reading Time: 3 minutes

Robin Gilijamse and I attended LyncConf 2014 in Las Vegas last February. We had a really good time attending parties from the UCArchitects and several other sponsors. We enjoyed some real American food and explored the Strip a bit. The weather even permitted a dive in the resort pool, and we generally had a really, really good time.

Keynote

Apart from the good times, there were also some tech sessions we attended. Kicking off with the keynote was Gurdeep Singh Pall, the Vice President for Lync & Skype Engineering at Microsoft. During the keynote he suggested replacing 'unified communications' with the term 'universal communications'. This signifies the direction Microsoft is heading with Lync: to create a communication platform that is available on any device, at any location, and that even transcends the boundary between consumer and enterprise platforms as the Lync and Skype possibilities are further developed.

He announced that Full HD video chat between Lync and Skype will become available this summer. Native interoperability with Tandberg video conferencing devices is also coming in the near future.

Furthermore they are aiming to create a consistent experience for work and personal situations by bringing Lync and Skype closer together.

Apart from the Tandberg and Skype-Lync video chat, there are some more things Gurdeep addressed. There is a JavaScript wrapper heading our way that will let you run Lync in a browser. There should be regular updates for the mobile clients. Lync Online will get PSTN dial-in/dial-out support and support for meetings with 1000+ users.

More details on this can be found on this blogpost by Gurdeep.

Ecosystem

While roaming the Tech Expo it became clear that there are quite a few developers and suppliers of applications and peripherals for use with Lync: from status lights you can put on your desk to video conferencing systems, headsets, desk phones and Survivable Branch Appliances. It becomes apparent that Lync is growing and business is booming.

Fortunately there are quite a few APIs, so the number of options keeps expanding.

This creates opportunity for everyone to expand upon the existing technology and come up with their own solutions. Also it gives customers great choice in all kinds of third party solutions, ranging from headsets, presence lights, to monitoring software and big video conference systems.

Concluding

Lync is serious business. The ecosystem is getting better and better. Microsoft is going to great lengths to ensure Skype and Lync will deliver a seamless communication experience. A lot of third parties are creating products to enhance and expand the capabilities of Lync, through both software and hardware solutions.

At this moment Lync is available on a wide range of devices. Android tablets will be added this year, leaving out Linux and a new client for Mac OS X (they still have a somewhat functioning 2011 client, though).

If Microsoft keeps this up, Lync will become a real Universal Communications Solution in the years to come.

Dirteam Bloggers at LyncConf 2014

Reading Time: 2 minutes

From Saturday February 15, 2014 to February 22, 2014, I will be in Las Vegas to attend LyncConf 2014. I’m very excited! Together with my colleague and fellow Dirteam blogger Robin Gilijamse, I will be getting all the ins and outs on the subject of Microsoft Lync and Unified Communications.

The 2014 Lync Conference will be held at the Aria Resort and Casino (Las Vegas, what did you expect) on the Las Vegas Boulevard.

Last year I went to TechEd Europe 2013 in Madrid with Sander Berkouwer and Maarten de Vreeze. That was the first big Microsoft event I attended. It was totally awesome, despite the funny hotel. So this will be my ‘big’ second event.

I passed my ‘Microsoft Certified Solutions Expert (MCSE): Communications’ last December with some instructional help from Robin Gilijamse. Therefore, I was granted the opportunity to go by my employer.

If everything goes according to plan, I will be a lot more knowledgeable on troubleshooting and reading Lync logs. Also I'm planning to get more information about the various voice options and Survivable Branch Appliances (SBA).

Our flight will leave from Amsterdam Schiphol Airport (AMS) in the morning and we’ll be making a short stop at Detroit (DTW). I heard it was freezing over there. Usually it is freezing in the Netherlands this time of year, although not at the moment. Fortunately, Las Vegas is +20 degrees Celsius. After our short stop in Detroit, we’ll be arriving at Las Vegas (LAS) at 16:54 local time.

Viva Las Vegas!

Las Vegas Strip 2009

Hope to see all you fellow Lync enthusiasts there!

SkyDrive crashes on my domain machine

Reading Time: < 1 minute

A while ago I wrote about restoring your Win-X menu when it did not show up after migrating from a Windows 7 member client to a Windows 8 member client. And by member client I mean Active Directory member.

It turns out this problem still exists in Windows 8.1. Also, in Windows 8.1 there seems to be a problem with the SkyDrive (OneDrive) integration, as in some cases SkyDrive will crash and not sync.

So here is a nice script to fix your Win-X menu and SkyDrive:


*** START OF SCRIPT ***

reg add HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\CLSID\{8E74D236-7F35-4720-B138-1FED0B85EA75}\ShellFolder /v Attributes /t REG_DWORD /d 0x00100000 /f

reg delete "HKEY_LOCAL_MACHINE\Software\Microsoft\Active Setup\Installed Components\WinX-fix-%COMPUTERNAME%" /f >NUL 2>NUL

reg add "HKEY_LOCAL_MACHINE\Software\Microsoft\Active Setup\Installed Components\WinX-fix-%COMPUTERNAME%" /v Version /t REG_SZ /d "1" /f

reg add "HKEY_LOCAL_MACHINE\Software\Microsoft\Active Setup\Installed Components\WinX-fix-%COMPUTERNAME%" /v StubPath /t REG_SZ /d "\"%ProgramFiles%\windows-8-fixes\fixWinX.cmd\"" /f

Reg add "HKEY_LOCAL_MACHINE\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer" /v DisablePersonalDirChange /t REG_DWORD /d 0 /f

xcopy /S /C /I /Q /H /R /Y "%PUBLIC%\..\Default\AppData\Local\Microsoft\Windows\WinX\*" "%LocalAppData%\Microsoft\Windows\WinX"

*** END OF SCRIPT ***


If for any reason the service formerly known as SkyDrive is not working after the script above, try the following:


Reg add "HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer" /v DisablePersonalDirChange /t REG_DWORD /d 0 /f


Also, this script can be incorporated into your deployment image, but it can also be deployed through Active Directory. Remember that this is a CMD script; the same can also be done through PowerShell cmdlets, as sketched below.
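
As a rough sketch, this is what a few of those reg commands look like when translated to PowerShell cmdlets. It is not a drop-in replacement for the whole script, just the general idea.

# Sketch: the ShellFolder attribute fix, translated to PowerShell
$clsid = 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Explorer\CLSID\{8E74D236-7F35-4720-B138-1FED0B85EA75}\ShellFolder'
New-Item -Path $clsid -Force | Out-Null
New-ItemProperty -Path $clsid -Name 'Attributes' -PropertyType DWord -Value 0x00100000 -Force | Out-Null

# Sketch: the DisablePersonalDirChange policy value (HKLM requires an elevated session)
$policies = 'HKLM:\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer'
New-ItemProperty -Path $policies -Name 'DisablePersonalDirChange' -PropertyType DWord -Value 0 -Force | Out-Null

# Sketch: copy the default WinX folder to the current user profile
New-Item -ItemType Directory -Path "$env:LOCALAPPDATA\Microsoft\Windows\WinX" -Force | Out-Null
Copy-Item "$env:PUBLIC\..\Default\AppData\Local\Microsoft\Windows\WinX\*" "$env:LOCALAPPDATA\Microsoft\Windows\WinX" -Recurse -Force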

Just remember you need to run this with administrative privileges.

Uninstall PowerShell from Windows Server 2012; And get it back

Reading Time: 4 minutes

A couple of days ago I was doing a deployment. It was one of those days where nothing seemed to work. Everyone is familiar with them and probably detests them as much as I do: the type of non-productive day where nothing gets done.

I actually had a lot of setbacks. Firewalls being too restrictive. Network traffic not routed correctly. Group Policy Objects (GPOs) magically breaking my clients Active Directory connections. And then, instead of throwing my laptop, I wanted to start from scratch with the server I was working on at that particular moment.

I got a little more than I bargained for. In my infuriated state I decided it was a good idea to remove all features and roles from the server. I opened Server Manager, chose to remove roles and features, and deselected all options, not really paying attention to what I was doing.

Knipsel2

The Server Manager gave me a summary of the jobs at hand, but of course I did not read it. I clicked the close button and went for a cup of coffee.

Knipsel6

When I came back, the server innocently asked me to reboot it, suggesting that was needed to complete the task I had assigned. Giving it no second thought, I gave the command to shut down forced and reboot immediately.

When it came back up, I logged in and was happily greeted by a command prompt shell. Thinking I had accidentally uninstalled the graphical user interface from Windows Server 2012, I wasn't worried for a second. So I gave the command to start PowerShell and was presented with the following error message.

Knipsel7

Oops…

What did I just do? Uninstall everything. And I really mean everything. It turns out that unchecking everything includes .NET 4.5, and removing everything leaves your server quite useless; something like a Windows Server 2008 Core installation.

Then I remembered Deployment Image Servicing and Management, usually referred to as DISM. First, I checked which features were available. I put the output in a text file because otherwise it wouldn't be very readable.

DISM /Online /Get-Features > features.txt

Knipsel10

As you can see, almost everything is disabled. This gives new meaning to "build from scratch".

PowerShell Engage

Then I used the following command to install PowerShell again:

DISM /Online /Enable-Feature:MicrosoftWindowsPowerShell /all

Notice the /all switch at the end of the command. This tells DISM to install the prerequisites as well.

Knipsel14

The PowerShell command now works again.

To the GUI

Still there is no Graphical User Interface. And for the sake of this blog I used DISM to install that as well:

DISM /Online /Enable-Feature:Server-GUI-Shell /all

This will ask for a reboot.

Then the server will install the GUI again and will be manageable by IT Pros who don’t like the command line that much.

Knipsel19

And after this operation completes and you log in, the GUI is back.

Knipsel20

Then I went on with the deployment as planned and this server is now running the Remote Access Role and configured to manage DirectAccess for a couple of Windows 8 clients. It's running this without any problems.

To Sum up

You can remove PowerShell from Windows Server 2012 by using the Server Manager and selecting the Remove Roles and Features option.

Afterwards you will boot into a Command Prompt where PowerShell will not work because it is not enabled.

I used the following DISM command to enable the feature again:

DISM /Online /Enable-Feature:MicrosoftWindowsPowerShell /all

With PowerShell enabled, it should be a breeze to fix the server some more.
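
If you prefer to stay in PowerShell for the rest of the repair work, the DISM cmdlets offer roughly the same functionality as the DISM executable. A sketch, assuming the Dism PowerShell module survived the feature removal (I have not verified that on a stripped-down server):

# List available features, roughly the equivalent of DISM /Online /Get-Features
Get-WindowsOptionalFeature -Online | Sort-Object FeatureName | Format-Table FeatureName, State

# Re-enable a feature including its prerequisites, roughly the equivalent of the /all switch
Enable-WindowsOptionalFeature -Online -FeatureName Server-Gui-Shell -All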

Keep in mind that this server still had its source files. There are methods to add or remove the source files.

Concluding

Windows Server 2012 is very modular. Almost everything is removable. There might be scenarios where this can be useful. It might just be something to keep in mind.