Updating the Photo Attributes in Active Directory with PowerShell

Today I got to have the joy of once again needing to get caught up on importing employee photos into the Active Directory photo attributes, thumbnailPhoto and jpegPhoto. While this isn’t exactly the most necessary thing on Earth, it does make working in a Windows environment “pretty,” as these images are used by things such as Outlook, Lync and Cisco Jabber, among others. In the past the only way I’ve ever known how to do this is with the AD Photo Edit Free utility, which, while nice, tends to be a bit buggy and requires lots of repetitive action as you manually update each user for each attribute. This year I’ve given myself the goal of 1) finally learning PowerShell/PowerCLI to at least the level of mild proficiency and 2) automating as many tasks like this as possible. While I’ve been dutifully working my way through a playlist of great Pluralsight courses on the subject, I’ve had to live dangerously a few times to accomplish tasks like this along the way.

So, long story short, with some help along the way from Googling things, I’ve managed to put together a script that does the following:

  1. Look in a directory passed to the script via the jpgdir parameter for any images with the file name format <username>.jpg
  2. Do an Active Directory search in the OU specified by the ou parameter for the username included in the image name. This parameter needs to be the full DN path (ex. LDAP://ou=staff,dc=foo,dc=com)
  3. If the user is found, make a resized copy of the image file in the “resized” subdirectory to keep the file sizes small
  4. Finally, set the resized image as both the thumbnailPhoto and jpegPhoto attributes on the user’s AD account

So your basic usage would be .\Set-ADPhotos.ps1 -jpgdir "C:\MyPhotos" -OU "LDAP://ou=staff,dc=foo,dc=com". This can easily be set up as a scheduled task to fully automate the process. In our case I’ve got the person in charge of creating security badges feeding the folder with pictures as they are taken for the badges, and this runs automatically at 5 each morning.

All that said, here’s the actual script code:
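The embedded script didn’t survive the trip to this page, so here’s a minimal sketch of what it does. The parameter names match the usage shown above; everything else (the ADSI search, the inline 96×96 thumbnail resize in place of the linked Image Resize function) is my own assumption of the shape, not the exact code from the GitHub link below:

```powershell
# Hedged sketch of Set-ADPhotos.ps1 — resize logic and attribute writes are assumptions
param(
    [Parameter(Mandatory=$true)][string]$jpgdir,
    [Parameter(Mandatory=$true)][string]$ou
)

Add-Type -AssemblyName System.Drawing
$resizedDir = Join-Path $jpgdir "resized"
if (-not (Test-Path $resizedDir)) { New-Item -ItemType Directory -Path $resizedDir | Out-Null }

foreach ($file in Get-ChildItem -Path $jpgdir -Filter *.jpg) {
    $username = $file.BaseName

    # Search the specified OU for a user matching the file name
    $searcher = New-Object DirectoryServices.DirectorySearcher([ADSI]$ou)
    $searcher.Filter = "(&(objectClass=user)(sAMAccountName=$username))"
    $result = $searcher.FindOne()
    if ($null -eq $result) { continue }

    # Make a small copy so the attribute stays a reasonable size
    $img = [System.Drawing.Image]::FromFile($file.FullName)
    $thumb = $img.GetThumbnailImage(96, 96, $null, [IntPtr]::Zero)
    $outPath = Join-Path $resizedDir $file.Name
    $thumb.Save($outPath, [System.Drawing.Imaging.ImageFormat]::Jpeg)
    $img.Dispose(); $thumb.Dispose()

    # Write both photo attributes and commit
    $photo = [System.IO.File]::ReadAllBytes($outPath)
    $user = $result.GetDirectoryEntry()
    $user.Properties["thumbnailPhoto"].Value = $photo
    $user.Properties["jpegPhoto"].Value = $photo
    $user.CommitChanges()
}
```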

 

Did I mention that I had some help from the Googles? I was able to grab some great help (read Ctrl+C, Ctrl+V) in learning how to piece this together from a couple of sites:

The basic idea came from https://coffeefueled.org/powershell/importing-photos-into-ad-with-powershell/

The Powershell Image Resize function: http://www.lewisroberts.com/2015/01/18/powershell-image-resize-function/

Finally I’ve been trying to be all DevOpsy and start using GitHub so a link to the living code can be found here: https://github.com/k00laidIT/Learning-PS/blob/master/Set-ADPhotos.ps1

Getting Started with rConfig on CentOS 7

I’ve been a long-time user of RANCID for change management on network devices but frankly it’s always felt a bit of a pain to use and not particularly modern. I recently decided it was time for my OpenNMS/RANCID server to be rebuilt, moving OpenNMS up to a CentOS 7 installation, and in doing so thought it was time to start looking around for a network device configuration management alternative. As is many times the way in the SMB space, this isn’t a task that actual budgetary dollars are going to go towards, so off to Open Source land I went! rConfig immediately caught my eye, looking to me like RANCID’s hipper, younger brother, what with its built-in web GUI (through which you can actually add your devices), scheduled tasks that don’t require you to manually edit cron, etc. The fact that rConfig specifically targets CentOS as its underlying OS was just a whole other layer of awesomesauce on top of everything else.

While rConfig’s website has a couple of really nice guides once you create a site login, much to my dismay I found that they hadn’t been updated for CentOS 7, and while working through them I found that there are actually some pretty significant differences that affect the setup of rConfig. Some differences are minor (no more iptables, it’s firewalld) but httpd seems to have had a bit of an overhaul. Luckily I was not walking the virgin trail, and through some trial, error and, most importantly, Google I’ve now got my system up and running. In this post I’m going to walk through the process of setting up rConfig on a CentOS minimal install with network connectivity, with hopes that 1) it may help you, the two readers I’ve got, and 2) when I inevitably have to do this again I’ll have documentation at hand.

Before we get into it I will say there are a few artistic licenses I’ve taken with rConfig’s basic setup.

  1. I’ll be skipping over the network configuration portion of the basic setup guide. CentOS 7 does a great job of providing a single configuration screen at install where you set up your networking among other things.
  2. The system is designed to run on MySQL but for a variety of reasons I prefer MariaDB. The portions of the creator’s config guide that deal with these components are different from what you see here but will work just fine if you do them the way described.
  3. I’m a virtualization kind of guy so I’ll be installing the newly supported open-vm-tools as part of the config guide. Of course, if you aren’t installing on ESXi you won’t be needing these.
  4. Finally, before proceeding please be sure to go ahead and run a yum update to make sure everything’s up to date and you really do have connectivity.

Disabling Stuff

Even with the minimal installation there are things you need to stop to make things work nicely, namely the security measures. If you were installing this out in the wild this would be a serious no-no, but for a smaller shop behind a well configured firewall it should be OK.

vi /etc/sysconfig/selinux

Once in the file you need to change the “SELINUX=enforcing” line to “SELINUX=disabled”. To do that hit “i” and then use vi like Notepad with the arrow keys. When done, hit Esc to exit insert mode and type “:wq” to save and exit.
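If vi isn’t your thing, a quick sed makes the same edit, and setenforce turns SELinux off immediately so you don’t have to reboot before continuing:

```shell
sed -i 's/^SELINUX=enforcing/SELINUX=disabled/' /etc/sysconfig/selinux
setenforce 0   # takes effect right away; the file edit covers future boots
```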

Installing the Prerequisites

Since we did the minimal install there are lots of things we need to install. If you are root on the box you should be able to just cut and paste the following into the CLI and everything gets installed. As mentioned in the original Basic Config Guide, you will probably want to cut and paste each line individually to make sure everything gets installed smoothly.
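The original package list didn’t make it into this post, so treat the following as an assumption of its general shape (the attr line is the one referenced in the SELinux section below; cross-check the rest against the official rConfig guide):

```shell
# Utilities, web server, database, PHP and (for ESXi guests) VM tools
yum -y install attr vim mlocate wget unzip crontabs ntp
yum -y install httpd httpd-devel mod_ssl openssl
yum -y install mariadb mariadb-server mariadb-devel
yum -y install php php-common php-cli php-devel php-mysql php-gd php-pear
yum -y install open-vm-tools
```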

Autostart Services

Now that we’ve installed all that stuff, it does us no good if it isn’t running. CentOS 6 used the chkconfig <service> on|off command to control service autostart. In CentOS 7 all service manipulation is now done with the systemctl command. Don’t worry too much; if you use chkconfig or service start at this point, both will still alias to the correct commands.
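For the services installed above, the CentOS 7 equivalents look like this (service names assume the package list I sketched earlier):

```shell
# Mark for autostart at boot, then start them now
systemctl enable httpd mariadb crond ntpd
systemctl start httpd mariadb crond ntpd
```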

Finalize Disable of SELinux

One of the hard parts for me was getting steps 5/6 in the build guide to work correctly. If you don’t do them the install won’t complete, but they also don’t work right out of the box. To fix this, the first line in the prerequisites installs the attr package, which contains the setfattr executable. Once that’s installed, the following checks to see if the ‘.’ is still in the root directory’s ACLs and removes it from the /home directory. By all means, if you know of a better way to accomplish this (I thought of putting the install in the /opt directory) please let me know in the comments or on Twitter.
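The commands themselves were lost from this post, so here is a sketch of what I believe they look like; the exact attribute being stripped is my assumption based on the guide’s intent:

```shell
# A trailing '.' in the permissions column of ls -l means an SELinux context is set
ls -ld / /home
# Strip the SELinux extended attribute from /home so the webapp can work under it
setfattr -h -x security.selinux /home
```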

MySQL Secure Installation on MariaDB

MariaDB accepts any commands you would normally use with MySQL. The mysql_secure_installation script is a great way to go from baseline to well secured quickly and is installed by default. The script is designed to:

  • Set root password
  • Remove anonymous users
  • Disallow root logon remotely
  • Remove test database and access to it
  • Finally reload the privilege tables

I tend to take all of the defaults, except that I allow root login remotely for easier management. Again, this would be a very bad idea for databases with external access.
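Start the database if it isn’t running yet and kick off the script:

```shell
systemctl start mariadb
mysql_secure_installation
```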

Then follow the prompts from there.

As a follow-up you may want to allow remote access to the database server for management tools such as Navicat or HeidiSQL. To do so enter the following, where X.X.X.X is the IP address you will be administering from. Alternatively you can use root@'%' to allow access from anywhere.
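The grant statement itself went missing from this post; a one-liner version through the mysql client would look something like this (the password is obviously a placeholder):

```shell
# Replace X.X.X.X with your admin workstation's IP, or '%' for anywhere
mysql -u root -p -e "GRANT ALL PRIVILEGES ON *.* TO 'root'@'X.X.X.X' IDENTIFIED BY 'YourPassword' WITH GRANT OPTION; FLUSH PRIVILEGES;"
```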


Configure VSFTPd FTP Software

Now that we’ve got the basics of setting up the OS and the underlying applications out of the way, let’s get to the business of setting up rConfig for the first time. First we need to edit the sudoers file to allow the apache account access to various applications. Begin editing the sudoers file with the visudo command, arrow your way to the bottom of the file and enter the following:
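The exact lines come from rConfig’s own guide and didn’t survive here, so treat this as an assumption of the general shape: give apache passwordless sudo for the handful of binaries rConfig shells out to.

```shell
# Appended at the bottom of sudoers via visudo — the binary list is illustrative,
# match it against the official rConfig guide
apache ALL=(ALL) NOPASSWD: /usr/bin/updatedb
apache ALL=(ALL) NOPASSWD: /usr/bin/locate
apache ALL=(ALL) NOPASSWD: /usr/sbin/crond
```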

rConfig Installation

First you are going to need to download the rConfig zip file from their website. Unfortunately the website doesn’t seem to work with wget, so you will need to download it to a computer with a GUI and then upload it via SFTP to your rConfig server. (ugh) Once the file is uploaded to your /home directory, back at your server CLI run the following commands:
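Assuming the zip landed in /home and that rConfig lives in /home/rconfig (as the next section implies), something like this unpacks it into place; the archive name is an assumption based on the current download:

```shell
cd /home
unzip rconfig-*.zip -d /home
chown -R apache:apache /home/rconfig   # the webapp runs as apache
```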

Next we need to copy the httpd.conf file over to the /etc/httpd/conf directory. This is where I had the most issues of all, in that the conf file included is for httpd in CentOS 6 and there are some module differences between 6 and 7. Attached here is a modified version that I was able to get working successfully after a bunch of failures. The file found here (httpd.txt) will need to replace the existing httpd.conf before the webapp will successfully start. If the file is copied to the /home/rconfig directory the shell commands would be:
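Something along these lines, keeping a backup of the stock config first:

```shell
cp /etc/httpd/conf/httpd.conf /etc/httpd/conf/httpd.conf.orig   # keep the original around
cp /home/rconfig/httpd.txt /etc/httpd/conf/httpd.conf
systemctl restart httpd
```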

As long as the httpd service starts back up correctly you should now be good to go with the web portion of the installation, which is pretty point and click. Again for the sake of brevity just follow along at the rConfig installation guide, starting with the rConfig web installation section, to the end. We’ll get into setting up devices in a later post, but it is a pretty simple process if you are used to working with networking command lines.

Quick How To: A restart from a previous installation or update is pending.

Just a quickie from an issue I ran into today trying to upgrade vCenter 5.5 to Update 3, or at least the SSO component of it. Immediately after running the installer I was presented with an MSI error: “A restart from a previous installation or update is pending. Please restart your system before you run vCenter Single Sign-On installer.” Trying to be a good little SysAdmin I dutifully rebooted, repeatedly, each time with no effect on the issue. I’ve seen different versions of this error in the past so I had an idea of where to go, but it seems to require googling each time. This is caused by data being present in the “PendingFileRenameOperations” value of the HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager key. Simply checking this value and clearing out any data within will remove the flag and allow the installation to proceed.
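As a sketch, the check and fix can be done from an elevated PowerShell prompt rather than regedit; this clears the value’s data (as described above) rather than deleting the value outright:

```powershell
$key = 'HKLM:\SYSTEM\CurrentControlSet\Control\Session Manager'
# See whether any rename operations are pending
Get-ItemProperty -Path $key -Name PendingFileRenameOperations -ErrorAction SilentlyContinue
# Clear the data to remove the pending-restart flag
Set-ItemProperty -Path $key -Name PendingFileRenameOperations -Value @()
```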

In this case I had an HP print driver doing what they do best and gumming up the works. I’d love to say this is the first time I’ve been done in by a print driver but you all would know I’m lying. 🙂

Proud to be a Veeam Vanguard

On July 27th Rick Vanover over on the Veeam Blog announced the inaugural class of what is known as the Veeam Vanguard of which I am honored to have been selected as a member. What the heck is a Veeam Vanguard? While best described in Rick’s announcement blog post, my take is that this group is composed of members of the IT and virtualization global community who are Veeam users and go above and beyond in sharing their knowledge of the ins and outs of the various Veeam products.  Frankly I am flabbergasted to be named and wish to thank them for the nomination.

Without getting too gushy or fanboyish, I have found over the years that Veeam’s products tend to solve problems we all deal with in a virtualized world. Backup & Replication especially has made my day-in, day-out life easier because I know my data is nice and protected and I can test just about anything I want to do without affecting the production environment.

In closing I just want to say congrats to all of the other nominees, and I look forward to seeing what you have to share. To say the group is geographically diverse is an understatement; as Veeam was ever so nice to include the nationalities of all members, it’s very cool to see so many flags represented. Many of those included I’ve followed on Twitter and in the blogosphere for quite some time, while others are new to me, but in the end I’m sure there will be some great knowledge shared and I look forward to getting to know you.

Setting Up Endpoint Backup Access to Backup & Replication 8 Update 2 Repositories

Part of the Veeam Backup & Replication 8 Update 2 release is the ability to allow users to target repositories specified in your Backup Infrastructure as targets for Endpoint Backup. While this is just one of many, many fixes and upgrades (hello vSphere 6!) in Update 2, this one is important for those looking to use Endpoint Backup in the enterprise, as it allows for centralized storage and management; equally important, you also get e-mail notifications on these jobs.

Once the update is installed you’ll have to decide which repository or repositories will be available to Endpoint Backup and provide permissions for users to access them. By default every Backup Repository denies Endpoint Backup access to everyone. To change this for one or more repositories you’ll need to:

  1. Access the Backup Repositories section under Backup Infrastructure, then right click a repository and choose “Permissions.”
  2. Once there you have three options for each repository in regards to Endpoint permissions; Deny to everyone (default), Allow to everyone, and Allow to the following users or groups only. This last option is the most granular and what I use, even if just to select a large group. In the example shown I’ve provided access to the Domain Admins group.
  3. You will also notice that I’ve chosen to encrypt any backups stored in the repository, a nice feature as well of Veeam Backup & Replication 8.

Also of note is that no user will be able to select a repository until they have access to it. When setting up the Endpoint Backup job, at the point where the Veeam server is specified you are given the option to supply credentials, so you may choose to use alternate credentials so that the end users themselves don’t actually have to have access to the destination.

Getting Started with Veeam Endpoint Backup

This week Veeam Software officially released their new Endpoint Backup Free product, introduced at VeeamON last October, after a few months of beta testing. The target for this product is to allow image-based backup of individual physical machines, namely workstations, allowing for Changed Block Tracking much like users of their more mature Backup & Replication product have been used to in virtualized environments. Further, Veeam has made a commitment that the product is and always will be freely available, making it possible for anybody to perform what is frankly enterprise-level backup of their own computers with no cost other than possibly an external USB drive to store the backup data. I’ve been using the product throughout the beta process and in this post I’ll outline some of the options and features and review how to get started with the product.

Also released this month by Veeam is the related Update 2 for Backup & Replication 8. This update allows a Backup Repository to be selected as a target for your Endpoint Backup job after some configuration, as shown here. Keep in mind if you are wanting to back up to local USB or a network share this isn’t necessary, but if you are already a B&R user this will make managing these backups much better.

Getting Started with Installation

[Image: your installation options] I have to say Veeam did very well keeping the complexity under the water on this one. Once downloaded and run, the installation choices consist completely of one checkbox and one button. That’s it. Veeam Endpoint Backup relies on a local SQL Server Express installation to provide backend services, just like the bigger Backup & Replication install, but it is installed on the fly. I have found that if there are pending Windows Updates to complete, the installer will prompt you to restart prior to continuing to configure your backup.

Configuring the Job

Once the installation is complete the installer will take you directly into configuring the backup, as long as you are backing up to an external storage device. If you plan to use a network share or Veeam Backup Repository you will need to skip that step and configure the job once in the application. Essentially you have the following options:

  • What you want to back up
    • Entire computer (image-based backup)
    • Specific volumes
    • File-level backup
  • Where you want to back it up to (each will generate another step or two in the wizard)
    • Local storage
    • A shared folder
    • Veeam Backup & Replication repository
  • Schedule or trigger for backups
    • Daily at a specific time
    • Trigger a backup on lock, log off, or when the backup target is connected


Personally I use one of three setups depending on the scenario. For personal computers I use an external USB drive, triggered when the backup target is available but set so that it never backs up more than once every 24 hours. In the enterprise, where Endpoint Backup handles those few remaining non-virtualized Windows servers, these are configured to back up to a Veeam Backup Repository on a daily schedule. Finally, I will soon begin rolling this out to key enterprise laptop users, and their backup will be to a B&R Repository as well, but triggered on the user locking the workstation with a 24-hour hold down. Keep in mind all of these options can be tweaked via the Configure backup button in the Veeam Endpoint Backup Control Panel.

Creating the Recovery Media

The last step of installing/configuring Endpoint Backup is to create the recovery media. This creates a handy disc or ISO that you can boot off of to do a Bare Metal (or Bare VM :)) recovery of the machine. From an enterprise standpoint, if you are rolling Endpoint Backup out to a fieldful of like machines I really can’t find a good reason to create more than one of these per model of device. Personally I’ve been creating the ISOs for each model and using them in conjunction with a Zalman VE-300 based external hard drive to keep from having lots of discs/pen drives around. If you are using this to back up physical servers it would also be a first step to being able to quickly restore to a VM, if that is part of your disaster recovery plan.

As a trick, I’ve found it worthwhile to install the product on a VM for no other reason than to create the recovery media. This way I know I’ll have the drivers to boot to it if need be. Further, once you boot to the recovery media you’ll find all kinds of little goodies that make it a good ISO to have available in your bag.

Conclusion

I’ve played with lots of options, both paid and free, over the years for backing up a physical computer on a regular basis, and even setting the general Veeam fanboy type stuff aside, this is the slickest solution for this problem I’ve ever seen. The fact that it is free and integrates into my existing enterprise solution are definitely major added bonuses, but even in a standalone, “I need to make backups of Grandma’s computer” situation it is a great choice. If you find you need a little help getting started, Veeam has created a whole Endpoint Backup forum just for this product. My experience both here and with other products is that there is generally a very quick response from very knowledgeable Veeam engineers, developers and end users happy to lend a hand.

Support Adobe Digital ID Signing with Automated Microsoft CA User Certificate Generation

Just a quick how-to, wanting to document a task I recently had need of. This process has a prerequisite of having a Microsoft Certificate Authority already available in your environment.

  1. Start > Run > mmc
    1. Add/Remove Snap-ins and choose the following:
      – Certification Authority (when prompted, add the name of your CA)
      – Certificate Templates
      – Group Policy Management
  2. In Certificate Templates, right click on “User” and choose “Duplicate Template”
    1. Set compatibility settings as needed. If you have a pure 2008 R2 Active Directory environment make it match. In terms of Certificate Recipient, make it match the oldest OS you have in use.
    2. Under General, change the Name to something meaningful as you’ll be referencing it later.
    3. Under the Security tab, set Domain Users to have both Enroll and Autoenroll permissions.
  3. In Certification Authority, right click on the “Certificate Templates” subfolder and choose New > “Certificate Template to Issue”
    1. Choose your newly created Certificate Template
  4. In Group Policy Management we are going to do a couple of things: set up your domain for certificate auto-enrollment and also define registry settings for Adobe Acrobat and Acrobat Reader.
    1. Choose to edit any GPO that will hit the users you wish to have certificates (the Default Domain Policy, for example).
    2. Navigate to User Configuration > Windows Settings > Security Settings > Public Key Policies
    3. Double click on Certificate Services Client – Auto-Enrollment and set:
      – Configuration Model: Enabled
      – Check Renew expired certificates…
      – Check Update certificates that use certificate templates
      – Hit OK
  5. Digital Signature Verification Preferences. By default Adobe Acrobat and Reader only recognize certificates signed by the usual public authorities as trusted, so you have to tell them to look at what is available in the local Windows Certificate Store. In Adobe Acrobat or Acrobat Reader you can do this in Preferences, under Signatures > Verification, by enabling “Validating Signatures” under Windows Integration. This can be cumbersome across the enterprise, but luckily this data is saved in a registry key, which means that through Group Policy Preferences we can manage the setting. The fix below will work for all Acrobat or Acrobat Reader versions 7 or later.
    1. Select the GPO of your choice to edit (again, I recommend the Default Domain Policy) and navigate to User Configuration> Preferences> Registry
    2. Right click in the window New> Registry Item
    3. You will need to create an entry with the following attributes:
      – Hive: HKEY_CURRENT_USER
      – Key Path: Software\Adobe\product\versionnumber\Security\cASPKI\cMSCAPI_DirectoryProvider
      (Example for Acrobat Pro 11: Software\Adobe\Adobe Acrobat\11.0\Security\cASPKI\cMSCAPI_DirectoryProvider)
      – Value name: iMSStoreTrusted
      – Value type: REG_DWORD
      – Value data: 60 (hexadecimal)
      – Hit OK
    4. Repeat steps 2 & 3 for each product/version combination you have in your environment. For example, in our environment we only have one version of Reader but 3 different major versions of Acrobat Pro, so I needed 4 variants of this key to cover each of them.
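For a one-off sanity check on a single machine before pushing the GPP out, the same value can be set from PowerShell; the path shown is the Acrobat Pro 11 example from the list above:

```powershell
# Manual equivalent of the Group Policy Preferences registry item
$path = 'HKCU:\Software\Adobe\Adobe Acrobat\11.0\Security\cASPKI\cMSCAPI_DirectoryProvider'
New-Item -Path $path -Force | Out-Null
# 0x60 matches the "60 (hexadecimal)" value data described above
New-ItemProperty -Path $path -Name iMSStoreTrusted -PropertyType DWord -Value 0x60 -Force | Out-Null
```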

And that’s it! It will probably take a little while for these policy changes to propagate naturally, but once they do it works very slickly. Once done, you and your users will be able to use their generated certificates as a Digital ID to sign any document with a digital signature field in a fillable form. Do keep in mind that while this will work and absolutely can and should be trusted within your organization, if you or your users need this type of service between organizations you will probably want to call the fine folks at VeriSign or Thawte.


Quick Config: Install ClamAV & configure a daily scan on CentOS 6

I’m pretty well versed in the ways of anti-virus on Windows, but I’ve wanted to get an AV engine installed on my Linux boxes for a while now. Looking around, I found a tried and true option in ClamAV and after a few stops and starts was able to get something usable. I’d still like to figure out how to have it send me a report by e-mail if it finds something, but that’s for another day; I don’t have enough Linux in my environment to necessitate putting the time in for that.

So with that here’s how to quickly get started.

Step 0: If not already there, install the EPEL repository
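On CentOS 6 the EPEL release package can usually be installed straight from yum; if it isn’t in your repos, the commented rpm line pulls it from the Fedora project (URL shown as a reasonable assumption for the 6.x package):

```shell
yum -y install epel-release
# If that package isn't available in your repos:
# rpm -Uvh https://dl.fedoraproject.org/pub/epel/epel-release-latest-6.noarch.rpm
```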

Step 1: Install ClamAV
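Package names as provided by EPEL for CentOS 6 (worth double-checking against your repo):

```shell
yum -y install clamav clamav-db clamd
```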

Step 2: Perform the 1st update of ClamAV definitions (this will happen daily by default afterwards)
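The first definition update is just a matter of running the updater by hand once:

```shell
freshclam
```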

Step 3: Enable and Start Services
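Since this post targets CentOS 6, the old service/chkconfig commands apply here:

```shell
service clamd start
chkconfig clamd on
```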

Step 4: Configure Daily Cron Job

I chose to have it scan the whole system and only report infected files; you may want to do differently.

Enter the following:
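Create the job with vi /etc/cron.daily/clamscan_daily (the file name is my choice; the log path matches where the output lands per the end of this post) and enter something like:

```shell
#!/bin/bash
# Daily full-system scan; appends only infected-file hits to the log
SCAN_DIR="/"
LOG_FILE="/var/log/clamav/daily_clamscan.log"
/usr/bin/clamscan -i -r "$SCAN_DIR" >> "$LOG_FILE"
```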

Note the -i option tells it to only return infected files, and -r tells it to search recursively. You may want to add the --remove option as well to delete files that are seen as infected.

Step 5: Make the Cron Job Executable
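Assuming the file name used above:

```shell
chmod +x /etc/cron.daily/clamscan_daily
```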

You can then kick off a manual scan if you’d like using:
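The same options as the cron job, run interactively:

```shell
clamscan -i -r /
```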

That’s it! Pretty simple, and all of your output will be logged daily to the /var/log/clamav/daily_clamscan.log file for review.

Top New Features in Veeam Backup & Replication v8

We are now a couple of months out from the release of version 8 of Veeam Software’s flagship product, Backup & Replication. Since then we’ve seen the first patch released a couple of weeks after, almost a Veeam tradition, and I’ve had it deployed and running for a while now. In that time I’ve found a lot to really like in the new version.

End to End Encryption

Backup & Replication now has the ability to encrypt your backup data from the moment it leaves your production storage system, through the LAN and WAN traffic, and once it is at rest, either on disk or tape. This encryption is protected by passwords stored both with humans and within the Enterprise Manager database, keeping a lost password from meaning lost backups. Finally, the encryption does not change the ratios for either compression or deduplication of the backup data.

Resource Conservation Improvements

Quite a few of the new Backup & Replication features are geared towards keeping your RPO goals from getting in the way of production efficiency. First and foremost is the availability of Backup I/O Control, a feature that monitors the latency of your production storage system and, if measured metrics climb above a user-defined level, throttles backup operations to return systems to acceptable levels.

On the networking side, if you have redundant or other non-production WAN links you now have the ability to specify preferred networks for backup data, with failover to production links if they aren’t available. Further, the WAN Accelerator for site-to-site backup copy and replication has been improved to allow for up to 3x the throughput seen in v7.

Cloud Connect

Both of the above features make this one possible. This new version brings a new partnership opportunity where VARs and other cloud storage service providers can act directly as a repository for your backup data. These providers can then allow you to spin these backups up as part of a second offering or as part of a package. With this, the need to own, manage and maintain the hardware for a DR site becomes much lighter, and I personally believe this will be a big deal for many in the SMB space.

New Veeam Explorers for Recovery

Veeam has been phasing out the use of the U-AIR wizards for item-level restore for a while, and with v8 we now have the release of Explorers for Active Directory, Microsoft SQL Server and Exchange. The Active Directory one is particularly of note because it not only allows you to restore a deleted AD item but to do so with the password intact. Transaction log backup for SQL servers is also now supported, allowing for point-in-time restores. The Exchange option has a few new features, but I especially like the option of recovering hard-deleted items.

These are frankly just the tip of the iceberg when it comes to the new features. For more on what’s new I recommend you check out the What’s New documents for both Backup & Replication as well as for Veeam ONE, Veeam’s virtualization infrastructure monitoring package.

 

Managing your vSphere 6 Environment

VMware released the long-awaited version 6 of its vSphere products today, and I’m sure you’ll be running out tomorrow to go update all your production environments….

Ok, now that we’re done laughing, what you probably are going to want to do is get your lab updated or built so you can work out the changes yourself, possibly using the EvalExperience licenses you got with VMUG Advantage. Once you get it up and running you’ll notice that a few things have changed from the administration point of view. In this post I’m going to take a quick look at the management features of vSphere 6.

Platform Services Controller

One thing you’ll find right off is that many of the underlying vCenter services have now been lumped together into what is called the Platform Services Controller. These services include Single Sign-On, licensing and certificate management. At installation you are given two options for how to deploy the PSC: embedded, where the PSC always rides along with vCenter, or external, where the PSC is installed on its own VM and each vCenter talks back to the central services controller.

There are a couple of design requirements here depending on which route you choose. You can have a maximum of 8 embedded or external PSCs per Single Sign-On site, and if you go the embedded route it will increase the minimum RAM required to 8 GB.

vSphere Web Client

As has been the trend, VMware has spent some serious time improving the Web Client, this time focusing on load time, login time and a more streamlined component layout. It is still Flash based, but a bit better nonetheless. Time will tell with this one.

vSphere Host Client

Is the death of the installable VI client we’ve been hearing about for years finally here? Not quite. There is a new Host Client meant only for connecting to hosts directly or to Update Manager, but the C# client for vSphere 6 will still function much the same way as the 5.5 client: you will be able to manage your infrastructure fully with it, though in terms of editing virtual hardware you will only be able to do so fully on VMs at hardware versions 5-8.* The good part is the new C# client is not version bound; it can be used to manage VMs running hardware versions 8-11.

Multi-Site Content Library

This one is probably what I am most excited about. Instead of having to update the ISO datastore in each of your locations, as well as building or copying your base templates for each vCenter, with the Content Library you can create a repository for all of your ISOs, templates, vApps and scripts and that repository will automatically be synchronized across all sites and vCenter Servers.

Virtual Datacenters and Policy Based Management

These two are the ones that I frankly still need to dive deeper into.  The concept is that you create virtual datacenters, spanning multiple locations (both local and cloud service) and then use policy to define what resources are available and where when spinning up a VM.

Certificate Lifecycle Management

Finally, on the management side, a new command line interface has been added for managing both VMware and third-party certificates. I recently used fellow vExpert Derek Seaman’s excellent tool and blog series to use Microsoft Certificate Services certs in my vSphere infrastructure; I have to believe this will make that process easier. As the documentation gets finalized I’ll provide a link to the docs for this here.

All in all it should be an exciting time for us virtualized folks, with lots of new toys and technology to try out.

*After the big Feb. 6 announcement VMware saw fit to let everybody know that there are major changes between what was there in the betas and what will be there in the GA build, this being one of them.