Tuesday, February 19, 2013

I need Aspirin for the Email Archive Headache: 3rd party; Exchange; None?


3rd Party Solutions

Back in the olden days you had to get a 3rd party solution to help with Single Instance Storage (SIS). The goal was to keep the Exchange mailbox database stores small. Then the SIS storage area transformed into an Email Archive some Legal Eagle could rummage through. This was all easy to justify, since storage was sky high in price and Exchange stores took forever to back up and restore, blah, blah, blah. We all know those stories.


Exchange Personal Archives

Then the Exchange Personal Archive enters the playground. It's meant for a user to simply move their existing mail to a secondary mailbox. That's different from the 3rd party solutions, which replace each message with a Stub or Shortcut. The Personal Archive removes the message from the mailbox entirely, keeping its size down. There are a lot of benefits to having fewer items in the mailbox and having policies for auto-archiving. But it doesn't resolve the issue of space. You have to put that secondary mailbox somewhere. It will still take up space.

Cheap Storage

Add "Cheap Storage" to the playground and now you've got real confusion! You can get super responsive storage, storage that compresses the data to half its original size. All for a fraction of the cost of the Olden Days storage.

So what?

Given all these choices, what's an Exchange Administrator to do? Storage costs are going down. And 3rd party licensing fees go up. Then there are those pesky maintenance fees. Sheesh! It's enough to start thinking storage is cheaper than any 3rd party archiving solution. You know what? In many cases, it might be!


Please sir, may I have "None of the above?"

We asked ourselves one important question: "Who controls the email?" That question may seem very easy to answer, but think about it. Are you a company subject to Sarbanes-Oxley or HIPAA or some such? Then the answer may be "The Company." If you don't fall under that umbrella, then the control may be in the users' hands.

I'm not talking about ownership. "The Company" owns the data for sure. "The Company" may even have policies in place that limit, by date, how much data a user can have. But the user controls how long the allowed data lives. I may delete a message tomorrow, or I may elect to keep it forever. Many users rely on CYA and keep everything forever. You know, "Just in case."

Where I work we have many Email projects coming together: PST Migration, Email Archiving, Discovery, to name a few. Retention was a big discussion. One executive wanted to be certain that we never kept anything the user did not keep. If the user deletes it, we need to make sure we don't keep it.
We needed to know if a new product could help with all this. We decided (well, the "Powers That Be" decided) that we don't need to keep backups of any email for more than two weeks, and that any historical data is decided by the user. (We don't have any time limits on mail; that is a battle for another time.)

So any email discovery via an Email Archive was out. We suddenly did not need the product for that reason anymore. And storage was so cheap, it was not a problem to throw 16 TB at the mail system. Piece of cake. We suddenly did not need the product to compress the data anymore, either.

Now it's just a matter of moving the archived data back into the mailboxes for 25,000 users. I'll be busy for a while. I'll just look at it as job security.

Issues we face

We started moving users' PST files into a Personal Archive because they were crashing the file servers with their 30,000 PST files. In those PST files were messages which were really just shortcuts to messages living in the Email Archive. And we put all those shortcut messages into the Personal Archive of the user.

I have yet to find a 3rd party archiving solution that can reach into a secondary mailbox and replace mail there with a shortcut. I hear you asking: "Why would you want to?" -- Because you also may want to do the reverse: replace the shortcut with the original message.

As I embark on this new bigger-than-the-universe project, I will post from time to time about the challenges we face. The first one is to replace all the stubs in all the mailboxes with the original messages. Then move on to the Personal Archives. Not sure about that as yet.

This blog is about PowerShell. In the next post I'll show how I used PowerShell to help me find and keep track of users, and flag potential problem users. Creating many databases, finding the smallest database, etc... Lots of work to do.



Monday, January 7, 2013

Mailbox Database FailOver Dance: Mail queues backing up

We had an issue where the cluster service failed on a DAG member, and so the cluster started to complain about membership and so on and so on...

The database tried to come up in several places, then it finally settled on DAG Member #2.

This pretty much happened within 30 seconds. A long time in my book for Exchange mailbox database failovers. And about 30 minutes later we started to be alerted about mail queues backing up. There were over 200 messages waiting to go to a particular database.

I had never seen this before and wondered if there was an AD replication issue. The error clearly pointed out that the database location and the user's location did not agree!


get-queue Site2HUB\Submission | get-message | fl

<snip>
LastError         : 432 4.2.0 STOREDRV.Deliver.Exception:WrongServerException.MapiExceptionMailboxInTransit; Failed to process message due to a transient exception with message The user and the mailbox are in different Active Directory sites.


So off I go to see the administrator in charge of AD. I wanted them to check the replication and maybe even reboot the server -- even if they didn't find anything, because by golly, I was right!

To their credit they refused to boot the server. And I went back to the drawing board.

In an effort to prove I was right, I actually proved I was wrong.

I found a stuck message, grabbed a recipient, and searched for their database and where it was mounted:

get-mailbox <username> -DomainController Site1-dc
Name                      Alias                ServerName       ProhibitSendQuota
----                      -----                ----------       -----------------
<blah>                    <blah>               MBX02           39.06 MB 

get-mailbox <username> -DomainController Site2-dc
Name                      Alias                ServerName       ProhibitSendQuota
----                      -----                ----------       -----------------
<blah>                    <blah>               MBX02           39.06 MB 

Both AD Sites showed same Server MBX02, for the user.
But the database was mounted in a different place.

[PS] C:\Windows\system32>Get-MailboxDatabase <blah's DB>

Name                           Server          Recovery        ReplicationType
----                           ------          --------        ---------------
<blah's DB>                    MBX05          False           Remote

I moved the database to MBX02 to make it agree with AD and all messages cleared.

My guess is I could have moved it to any of the other 3 servers and it would have worked.

So I was right, this was an AD issue; I was just wrong about it being a replication one. ;)

It was the way Exchange wrote those AD entries during all the bouncing around at 10:00 AM. Moving the database made Exchange rewrite those entries.
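For the record, the fix itself was a single cmdlet. Here's a hedged sketch assuming Exchange 2010, with the placeholder names from the example above:

```
# Activate the database copy on the server AD says owns it.
# "<blah's DB>" and MBX02 are placeholders, not real names.
Move-ActiveMailboxDatabase -Identity "<blah's DB>" -ActivateOnServer MBX02 -Confirm:$false
```

Once the move completed, Exchange rewrote the AD entries and the queue drained on its own.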


Monday, December 3, 2012

Mailbox Move error: Couldn't switch the mailbox into Sync Source mode

I've been working on a project to import PST files for quite some time. Some users we've helped have had many PST files and some had very large files.

One of our Archive Mailbox databases became too large for our tastes, so we decided it was time to trim all these databases down to 100G each from 380G each. Our main issue with the size was a place to restore. The time it takes to restore the database wasn't a problem; these were archived messages and the SLA was a full day. The problem was the 400G chunk of free space needed to do the restore. That was hard to find.

So as we moved archive mailboxes, I came across this error from time to time. The first time I saw this error, we were dealing with a user's mailbox that had over 22,000 folders. So I just exported her archive mailbox into several pieces and then imported it into another database. I assumed that was a one-time issue, and just moved on to my next task.

Then it happened again. This time it was on an archive mailbox 35G in size. I started searching on the error and found very little. Here's the error:

==============================================================
FailureCode                      : -2146233088
FailureType                      : SourceMailboxAlreadyBeingMovedTransientException
FailureSide                      : Source
Message                          : Error: Couldn't switch the mailbox into Sync Source mode.
                                   This could be because of one of the following reasons:
                                     Another administrator is currently moving the mailbox.
                                     The mailbox is locked.
                                     The Microsoft Exchange Mailbox Replication service (MRS) doesn't have the correct
                                   permissions.
                                     Network errors are preventing MRS from cleanly closing its session with the Mailbox server. If this is the case, MRS may continue to encounter this error for up to 2 hours - this duration is controlled by the TCP KeepAlive settings on the Mailbox server.
                                   Wait for the mailbox to be released before attempting to move this mailbox again.
FailureTimestamp                 : 9/15/2012 6:24:00 PM
FailureContext                   : --------
                                   Operation: IMailbox.SetInTransitStatus
                                   OperationSide: Source
                                   Archive ()
                                   Status: SyncSource
==============================================================

I finally called Microsoft Support who said "This is a very common issue. Here's what you need to do... Change the TCP KeepAliveTime for the servers involved in the move. Source, Target, and CAS."
It turned out that I had to do that so often, I wrote a simple script to make the changes for me easily.
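In case it helps, here's a minimal sketch of the kind of change involved -- setting KeepAliveTime in the registry on each server in the move path. The decision to change it came from Microsoft Support; the 300000 ms value below is illustrative, not their exact recommendation, so check with Support for your environment:

```
# Run on each server involved in the move: Source, Target, and CAS.
# KeepAliveTime is in milliseconds (300000 = 5 minutes).
$tcpParams = 'HKLM:\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters'
Set-ItemProperty -Path $tcpParams -Name KeepAliveTime -Value 300000 -Type DWord

# Verify the new value took.
Get-ItemProperty -Path $tcpParams -Name KeepAliveTime
```

The TCP stack doesn't pick the value up immediately, so plan for a reboot window.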

Get the code here: http://poshcode.org/3808


Enterprise Wide PST Import - The Functions

This is Part 12 in a series of posts about my experience tackling the migration of PST files.
The first post in the series is here.
(This is the last post in this series.)


The PST Utility Module is really just a collection of functions. Here's a list of the functions and short description of how they are used.

Get the module here.

Queue Processing:

These queue processing functions are used to do the work of Importing PST into Archive Mailboxes as well as maintenance of the queues.

Add-PSTImportQueue
Given a user name, find their PST files, set up the job queues for the PST files, and notify the user and admin with an initial report. A more detailed explanation is here.

Process-PSTImportQueue

Loop through each job and take appropriate action until the job goes through all the stages of: Copied,  Imported, Cleaned up, and Notified. A more detailed explanation is here.

Set-PSTImportQueue

Set different attributes on a user’s jobs, or a single job. A more detailed explanation is here.

Remove-PSTImportQueue
Remove a user or a job from the queue.

Report-PSTDiscovery
Finds all PST files for All Mailbox Users and tracks their usage – weekly run

Report-PSTOverallSummary
Generates a daily detailed report in an HTML file – also sends a summary report to the Admin and Boss.


Admin Functions:

These functions are meant for Admins to run. They can help the Admin decide how to process a user's PST files.


Check-ForPST
Same as Get-PSTFileList below, except input is geared to an Admin and not a Job Object


Get-ArchiveUserInfo
A quick report about a user, do they have an Archive mailbox already, GPO, etc

Get-ImportStatus
A quick way to check on import queues and suspend and restart, etc

Move-PSTs
This checks for and moves a user's PST file from the Home Share to their local PC

New-OnlineArchiveByUser
This adds the Archive Mailbox to a user, setting all the defaults. For people without PST files to import

Optimize-PSTImportQueue
Sort users by number of PST files, so the greatest number of users get done quickly


Helper Functions

Add-ToGPO
Add a user to a group we use to control Disallow PST Growth

Adjust-Quota
Sets quota to default for Online PST users

CC
Counts collections 0, 1, many

ConvertTo-ComputerName
Returns the computername for an IP

ConvertTo-IP
Returns the IP for a given Computername

Clean-OnlinePSTGPO
In our world, there is a distinct set of rules for the GPO, we check them here. Add people who are missing and remove people who should not be there.

Copy-PSTusingBITS
Uses BITS to copy the PST file to the share for processing. From PCs and far-away sites

Copy-PSTtoImportShare
Uses copy-item to copy the PST file to the share for processing. Used for local AD site files

Format-JobStatus
Returns the job status based on user or overall

Get-ISODayOfWeek
Returns the day of the week for a given date

Get-FreeDiskSpace
Returns the free space of a given computer drive – used when moving PST files back to a local computer.

Get-ClientAccessUserLog
Returns raw info on a user extracted from RPC logs on CAS servers

Get-MRSServer
Returns the MRS Server to use for this move

Get-OutlookEXEVersion
Returns the version of Outlook installed on a computer by looking at the EXE

Get-SingleMailboxObject
Returns a mailbox object, if results are anything other than a single entry, it returns nothing

Get-PSTFileList
Return the PST file collection from Home Shares or PC or some directory

Get-ArchDB
Returns the best candidate mailbox database for the Import. By smallest mailbox database size.

isMemberof
Tests the membership of a group

Import-PSTFromShare
Starts the import process using the Job Object for settings

Lock-PSTIQ
Creates a zero length file to signal processing is happening

Move-PSTToLocal
This moves a PST file from a Home Share to the local PC – if all test conditions are true

New-ArchPSTFileLogObject
Creates a new object for logging the PST file activity of Users with Archive Mailboxes

New-ArchPSTIndex
Finds or creates a new Index entry for a PST file. Used to search the PST related Filelog Objects

New-PSTFileLogObject
Creates a new object for Logging all PST files for all users

New-PSTFileIndex
Finds or creates a new index entry for a PST file object for All Users

New-PSTJobObject
Creates the Job object used in PST import Jobs

New-PSTReportObject
Creates object used in PST Reporting

New-OnlineArchiveByJob
Gives the user an Archive Mailbox, using the Job Object passed. Only used in Add-PstImportQueue

New-TargetPSTDirectory
Creates a new directory in the share using the User’s log in name

Reset-PSTBackupFileNames
Resets all Backup filenames to Now() using the Format ‘<filename>-yyyyMMddHHmmss.<ext.>’

Send-NotificationInitialReport
When add-pst is run, and discovers PST files to process, a summary is sent to the user.

Send-NotificationFinalReport
When processing is finished, the results are sent to the user.

Send-NotificationPSTRemoval
When reports are run, a message is sent to the user about still connected PST files (14 day cycle)

Test-OutlookClientVersion
A way to control which client versions are accepted for import

Test-PSTIQLock
Test if the Queue is locked, returns true/false

Unlock-PSTIQ
Removes the lock – when the processing pass is done










Enterprise Wide PST Import - Get the script / Set up

This is Part 11 in a series of posts about my experience tackling the migration of PST files.
The first post in the series is here.
The next post in the series is here.

I put these scripts together over time and having to go back and look at all the moving parts makes me realize how much time went into this.

Get the module here.

So follow these few simple steps and you're ready to import PST files!


First you'll need the permissions:

Granting the permission to do PST file Import and Exports is explained here. Just assign yourself the role.

As for finding and copying PST files, well, you might need a little more oomph. I have an account that has Domain Admin privileges. That account is for Admin duties only. (I have a separate 'normal' account I use for everything else.) My admin account also gives me access to all the computers where PST files may be hiding, since Domain Admins is in the Local Administrators group on each computer.

You could use a service account which has all these privileges and run these functions using that account. I tend not to like using service accounts because you lose accountability.

Create the Share:

Exchange PST file import and export cmdlets require a share. Find a server that can handle the number of PST files you plan to import and create a share. Grant the "Exchange Trusted SubSystem" group full access to this share. We named our share "PSTImports." It's why all the folder structure begins there.

We were lucky enough to have a utility server lying around that has 1TB of space on one of the drives. We use the server for reporting and performance history. And there was plenty of free space to handle importing 800G of PST files in one pass. We never got that high, and we even added pieces into the script that examine the free space of our server to throttle PST copies in case we ran short of space. You should adjust this setting for your server. It is $Script:FreeSpaceLimit and is set in MB.
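The free-space check in the module boils down to something like this (a sketch only; the server name 'UTILSRV' and drive letter E: are placeholders for your own environment):

```
# $Script:FreeSpaceLimit is in MB; hold off on copies when the
# import drive on the utility server falls below it.
$Script:FreeSpaceLimit = 50000   # 50 GB - adjust for your server

$disk = Get-WmiObject Win32_LogicalDisk -ComputerName 'UTILSRV' -Filter "DeviceID='E:'"
$freeMB = [math]::Floor($disk.FreeSpace / 1MB)
if ($freeMB -lt $Script:FreeSpaceLimit) {
    Write-Warning "Only $freeMB MB free - throttling PST copies."
}
```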

You'll want to create some other directories within this share:
  • ^Completed
  • ZZSavedQueueLogs
Other directories are created during the import, one for each user being processed, to hold their PST files. After the Import, the PST files are replaced with the results of the Import in CSV, then moved to the ^Completed directory for historical purposes. It's named with the leading ^ for sorting purposes. It will always be at the top ;)

ZZSavedQueueLogs is just a backup of the queue files. A backup is made each time you run the functions. I can't tell you how many times having those backed up has saved me some extra work.
I check this directory -- when I think about it -- and kill any older stuff.

Add PSTUtils As A Module and Optionally add to profiles:

Log on the server with the account that has Local Administrator rights and create this directory:
C:\Windows\System32\WindowsPowerShell\v1.0\Modules\PSTUtility
And copy the PSTUtility module there.

When you load up PowerShell and then run "get-module -listavailable", you should see PSTUtility as a module you can import. Just use the command "import-module PSTUtility".

Add this to your $profile so you have it available each time you log in.
I use this command all the time: "notepad $profile". Make the changes and then save. Restart PowerShell to load the new profile changes and you are ready to go.
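Put together, the load-and-profile step looks roughly like this (paths as above):

```
# One-time: confirm the module is visible, then import it.
Get-Module -ListAvailable PSTUtility
Import-Module PSTUtility

# Make it permanent: append the import to your profile.
Add-Content -Path $profile -Value 'Import-Module PSTUtility'
```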

Edit the Module: look over and change the $Script: level variables to suit your particular environment.

Configure Initial IP/Computer searches: (Optional)

When we get a request for a user's PST files to be imported, we usually just get a name. I'd love to have the computer name so I can determine if the Outlook version on that computer will indeed support the Archive Mailbox feature. It's not always supplied. So we ask. "I don't know. He took his laptop with him and he is gone for a few days." Let's just say, it's not always easy to get. Even if you ask the users and tell them how to find it, it's a hit-and-miss proposition.

One can find a user's log-in information in the RPC logs of your CAS servers. But there's a lot of information in those logs ... lots! So I wrote a small utility that grabs what I call 'Connect Logs' - this is a subset of the RPC logs where OUTLOOK.EXE shows a status of "Connect." This entry holds the version of Outlook connecting, the legacyExchangeDN, the Mode (Cached or Classic), and the IP address. Looking through this subset of logs vs the full logs is a time saver when looking up 50 people and trying to find information on them.

Of course, users don't always 'connect' every day. They leave Outlook running for days and days. So their connect log may be there, but it was from last Thursday. I tried various combinations of how to search for this data, how far back to look, etc.

I finally just decided to get 30 days' worth of these connect entries and use those for all my reporting and PST file imports. I have a script that runs every night at 2 AM and gathers this subset of logs. It's "Get-CASConnectLogEntries". You can copy that piece out, or add "import-module PSTUtility" to the profile of whatever is running the script. I have it running separately in a batch file I use for overnight reports.

I keep these connect files in a different directory, along with all my reporting information. You can of course modify the variable and keep them where you like. I use these connect files for other reports too. There's lots of information in those logs.
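The nightly gather boils down to something like this. A rough sketch only -- the log path is the Exchange 2010 default, and the filter pattern is an assumption about the log layout; the real logic lives in Get-CASConnectLogEntries in the module:

```
# The RPC Client Access logs are CSV text files; keep only the
# rows where OUTLOOK.EXE performed a "Connect" operation.
$logDir = 'C:\Program Files\Microsoft\Exchange Server\V14\Logging\RPC Client Access'
Get-ChildItem $logDir -Filter '*.log' |
    Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-30) } |
    Select-String -Pattern 'OUTLOOK\.EXE' |
    Where-Object { $_.Line -match ',Connect,' } |
    ForEach-Object { $_.Line } |
    Set-Content 'D:\Reports\ConnectLogs\ConnectEntries.txt'
```

The output directory above is a placeholder; point it wherever you keep your reporting data.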

I should add, running Get-CASConnectLogEntries at 2 AM is optional. If you know the computername each time you do Add-PSTImportQueue, you can bypass setting all this up. When you know the computer name, you can query the computer for all the information you need.

That's all there is to setting up everything.

Once you have these pieces in place, the hard part starts. Finding people to migrate to the Archive Mailbox.

Next, I'll be posting a list of all the Functions and how I use them.



Introduction: The Beginnings
Part 1: Script Requirements
Part 2: Add-PSTImportQueue
Part 3: Process-PSTImportQueue
Part 4: Some Tools we added
Part 5: Set-PSTImportQueue
Part 6: About PST Capture
Part 7: More PST Import Tools
Part 8: Using RoboCopy
Part 9: Morning Status Report
Part 10: Using BITS Transfer
Part 11: Get the script / Set up
Part 12: The Functions

Wednesday, September 19, 2012

Report-RecipientCounts - And Newborn Kittens

Yesterday some lady sent me an email -- at work -- and she needed to know if I wanted some of her newborn kittens. At first, I'm like: "Duh! As if!"

I don't know this woman, or care to. I just shrugged it off.

Then I heard my fellow admins cringe or moan, one after the other, at the same email. Then someone piped up, "How many people did this go to?"

Crap! I should have seen this coming, and just as I'm finishing up my trace message commands, into my cube walks one of the "Powers-That-Be" wanna-bes.

"How can this happen? I thought we had this locked down so no one could send a message to everyone in the company!" Crap! She sent it to everyone? No wonder my command was taking longer than I had expected. When my report came back, she had sent it 3 times, not just once. The first 2 times she tried some groups and found out she did not have the permissions needed to send to the larger groups.

But newborn kittens are important and the news must get out! She pulled up the address book, clicked the 1st person, then <ctrl><a>, and had everyone selected and put them on the To line.

This really happened. (Well, not exactly like this but very close.)

Protection from the Kitten Email Flood

You protect yourself with Default Recipient count maximums:
(Get-TransportConfig).MaxRecipientEnvelopeLimit - We had ours set to 4000.

In the Olden Days (Exchange 2003) the way recipients were counted was different from today in Exchange 2010. Back in 2003, the groups were all expanded before there was a count of users. But there were issues with doing the expansion before the count, I gather -- something about expanding too many groups and taking forever to finally deliver. I can't remember all of them. So the Microsoft Exchange Boys changed the way the count of users was made. Distribution Groups were now just counted as one. So if I sent a message to myself and a group containing 50 people, the recipient count would be 2.

Our 4000 setting was left over from those bygone days of 2003. 4000 was just fine for our world of 25,000 users. Many people sent to groups, and those recipients ultimately numbered high in the thousands, but never as high as 4000. The setting was just ported over to 2007 and then to 2010. We didn't have any reason to take notice.
Until now.

So Make A Report Already!

So the burning question of the day went from "How did this happen?" to "Can we set the Max to 20 recipients?" -- well, that's a bad thing too. People must work. In my book, you slap the hand of the person who did the bad deed and move on. I convinced the Power-Wanna-Be that we had to change the number, true, but we needed to know what a good number is.

Then they asked "What is normally the highest number of recipients sent to each day? Each week?"
I had to say, "I do not know ... But I can find out."

So I wrote this script which keeps a running tab of recipient counts.
You can get it here.
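The core of the script is just message-tracking data grouped by recipient count. A sketch of the idea, not the posted script itself -- the server name and date window are placeholders:

```
# Top recipient counts for the last day on one Hub Transport server.
Get-MessageTrackingLog -Server 'HUB01' -EventId RECEIVE `
    -Start (Get-Date).AddDays(-1) -End (Get-Date) -ResultSize Unlimited |
    Sort-Object RecipientCount -Descending |
    Select-Object -First 10 Timestamp, Sender, RecipientCount, MessageSubject
```

Run that over a few weeks and you have a defensible number to feed back into Set-TransportConfig.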


Monday, August 13, 2012

Enterprise Wide PST Import - Using BITS Transfer

This is Part 10 in a series of posts about my experience tackling the migration of PST files.
The first post in the series is here.
The next post in the series is here.

Times Change
Our original goal was to move all PST files from our users' Home shares into their Exchange 2010 Archive Mailboxes. This has expanded to include all PST files everywhere now. I understand the legal discovery benefit. There is also a Help Desk benefit, because users are always calling in about this PST file or that PST file. It's just better not to have PST files at all. So the Powers-That-Be modified our goal to include all PST files no matter where they might be hiding.


The New Challenge
When asked, we would import PST files for people who were not at HQ as a courtesy. This was limited to a small group of people. While migrating those users, I hit the barrier of bandwidth. Some locations just don't have that much to spare.

Starting off, I just tried a conventional OS copy, which didn't work so well. There were several cases where the network team was called in because something was "clogging up the WAN and it was very slow."

I tried RoboCopy, and posted here about using RoboCopy for pulling the PST files back across the WAN line, mainly because I knew it had the feature to pause and allow other traffic to move. Seemed like the perfect solution. But there were issues on my end. Like a conventional OS copy, RoboCopy was always running in the current session of PowerShell, or some spawned PowerShell session. If you forgot that and exited the session, or logged off the server, the copy stopped.

BITS version
I saw a post on Powershell.com (here) about using BITS transfer and started to investigate. In a few tests, BITS performed as advertised. The WAN did not suffer -- the network and server teams had already configured BITS on all PCs and servers to operate in a "friendly to the WAN" way. I was just hopping on their bandwagon.

So BITS is my answer for the file transfer. If I rebooted my utility server, the job was still there. If the user rebooted their PC the job was still there.
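A minimal sketch of the BITS pull, with placeholder machine names and paths (the module's Copy-PSTusingBITS does the real bookkeeping around this):

```
Import-Module BitsTransfer

# Low priority keeps the WAN usable; -Asynchronous lets the job
# survive reboots on either end.
$job = Start-BitsTransfer -Source '\\USERPC\c$\PSTs\archive.pst' `
    -Destination '\\UTILSRV\PSTImports\jdoe\' `
    -Asynchronous -Priority Low -DisplayName 'PST pull: jdoe'

# Later, once the job reports Transferred, finalize it so the
# file is released to its destination.
if ((Get-BitsTransfer -JobId $job.JobId).JobState -eq 'Transferred') {
    Complete-BitsTransfer -BitsJob $job
}
```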

There are still issues: the PST files are in use nearly all the time, the PC goes to sleep, users carry their laptops home. These are all normal problems that can't be scripted away. ;)

PST files being "In Use"
We either wait until the user goes home for the day, tell them to shut down Outlook, and pray they do. Or we talk them into disconnecting the PST files. If the users want the PST files imported, they usually are very willing to help.

Disconnecting PST files
Users are not too savvy about disconnecting PST files. In many cases we have to help. Usually when we get to this stage, there are many reasons we are helping them. They just don't ever get this kind of experience and don't have a clue. Someone set up PST files for them and they just use them as folders.

PC going to sleep
This requires some hands on. We can send the instructions to the user if they are a local admin on their PC, but many are not. We talk with the Local Technician and ask if they can help us by turning off that feature for a period of time. Or we just log into the machine to do that remotely.
(Will my next PowerShell script be one that turns off power-save mode and turns it back on, even when the PC doesn't have PowerShell installed? I should look into that...)


Status Update
This project is moving along smoothly and we've finished almost 50% of the PSTs on Home Shares. I think I am about done posting about this project; most of what is happening now is just routine. Soon I'll be posting the Module which has all the code. I hope to post that in the next two weeks to a month.



Introduction: The Beginnings
Part 1: Script Requirements
Part 2: Add-PSTImportQueue
Part 3: Process-PSTImportQueue
Part 4: Some Tools we added
Part 5: Set-PSTImportQueue
Part 6: About PST Capture
Part 7: More PST Import Tools
Part 8: Using RoboCopy
Part 9: Morning Status Report
Part 10: Using BITS Transfer
Part 11: Get the script / Set up
Part 12: The Functions