Tuesday, 27 July 2010

Email archiving, e-discovery and poor business practice

I've just been reading this blog post about a disgruntled customer of Mimosa's email archiving platform: http://blog.foreignkid.net/2009/11/mimosa-systems-nearpoint-horrible-company-and-relationship-stay-away/ . It is a very interesting read, and while it may be one-sided, it does show how better handling of the situation could have led to a much better outcome for Mimosa. When customer expectations are not being met, a company has a couple of options:

1. 'The customer has signed an agreement to purchase our software; they are cancelling the project, so we will keep their money.' Short-sighted and greedy.
2. 'It's the software that has caused this issue, and over a protracted period we can clearly see the solution doesn't work, so let's agree to a settlement that works for both of us.' This is the approach a responsible software vendor should take.

It is a fact that a software product will not always work correctly in a given situation, but as responsible businesses that hope to have future opportunities with the same or similar customers, we shouldn't burn our bridges.


Disclaimer: the author works for Quest Software, which has an email archiving platform.

Sunday, 25 July 2010

The Question of Storage and IT

Over the past several years, something insidiously unstoppable has been growing in the data centres and server rooms around the globe: we shall call it storage capacity. In the early days storage capacity grew slowly; an application owner might require 10 or 20 gigabytes for their CRM application or their SQL database. The IT manager would, as always, resist and ask for further justification, though as always the storage would eventually be procured and provisioned for the application, by connecting an additional disk drive or moving data to another server. Disk expansion in those days was constrained by the physical capacity of the server or the attached storage unit, and that limit was often absolute: if the drive bays were full, larger drives might be installed in place of the existing ones and the array expanded once again. The next obstacle was often an OS limitation that prevented RAID arrays from being easily expanded and made accessible to the OS, so allocation restrictions put a brake on data growth. If additional storage capacity cost more than the data was worth, the data simply didn't get stored.
In traditional IT departments, email and database servers were often the worst consumers of 'difficult' storage, and IT departments fought a constant battle to keep these complex applications, not to mention the storage they consumed, in check. Many now 'aged' IT administrators have been through the thankless hunt for the largest mailboxes and the 'power' users who owned them, back when disks were Direct Attached Storage (DAS) and the production Microsoft Exchange 5.5 server had stopped the Information Store service because there was not enough space left to keep processing email.

Technology had been evolving to solve this conundrum, but the cost remained prohibitive until a few years ago.
The solution was the Storage Area Network (SAN) for databases and high-transaction applications, and Network Attached Storage (NAS) for file storage where transactions per second are not critical.
SAN and NAS technology has clearly revolutionised IT: capacity is now easier than ever to acquire, provision and re-provision between servers fitted with a Host Bus Adapter (HBA), though that flexibility makes SAN/NAS more expensive than DAS.

SAN/NAS technology means storage can be spread evenly amongst the servers that need it most, then reclaimed and re-provisioned whenever another server requires capacity. This is a large jump up from the scenario where a server has a disk array attached and only that single server can access the disks. One issue remains, though: the available capacity can still be exhausted. Some of you will have seen what happens when the SAN runs out of space and another storage unit needs to be procured; the price can be astronomical compared to the individual disk drives consumers buy for home use.

To this problem, storage vendors and clever developers came up with a further answer. Storage assignment had moved from potentially inefficient DAS to expensive but centrally managed and easily assigned SAN, until finally the concept of storage virtualisation, and with it thin provisioning, was born.
SAN provisioning became more flexible yet again. Storage administrators have always over-allocated the storage 'assigned' to a host machine to allow for data growth; now SAN storage could be allocated so that the OS believed it had all the storage it desired, removing the OS limitations on resizing volumes, because volumes could be sized generously at the outset without regard to the physical SAN capacity actually available. Many amongst you will realise that allocating storage before it exists comes with a risk (there can be no power without risk).
Thin provisioning relies on the fact that storage is generally over-allocated to servers (OS drives will have at least 5-10 GB unused to start with), and in most cases that headroom might as well sit in the general storage pool until it is actually used. Thin provisioning has a good use case, but it can be catastrophic if the underlying pool really does run out of physical capacity.
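As a toy illustration of that risk (all the numbers below are invented for the example, not taken from any real array), compare what a thin-provisioned pool has promised to its hosts against what it physically contains:

# Illustrative figures only: a thin-provisioned pool promises more capacity
# than it physically has, which is fine right up until the writes catch up.
physical_tb  = 20.0                      # what the array actually contains
allocated_tb = [5.0, 8.0, 6.0, 10.0]     # what each host believes it owns
written_tb   = [1.2, 3.5, 0.8, 2.1]      # what each host has actually written

oversubscription = sum(allocated_tb) / physical_tb   # > 1.0 means over-committed
pool_used_pct    = 100 * sum(written_tb) / physical_tb

print(f"Promised {sum(allocated_tb):.0f} TB against {physical_tb:.0f} TB physical "
      f"({oversubscription:.1f}x over-committed); pool is {pool_used_pct:.0f}% full")

The moment the pool's 'percent full' figure approaches 100, every over-committed host is at risk at once, which is exactly the catastrophe described above.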

As illustrated, technology has always solved the storage problem in the enterprise: capacity keeps growing and the business keeps using whatever capacity is available.
Once storage stopped being an issue, users started storing everything without any consideration of cost; after all, it was easy enough to add more capacity.
On top of this primary use of storage, backups went from Production Server Disk -> Backup Tape (two copies) to Production Server Disk -> DR Server Disk -> Backup Staging Disk -> Backup Tape, often up to four times the initial data size, all of which has to be stored, and that is before the built-in RAID overhead is taken into account.
There are obvious benefits to storing data more than once: recovery is faster, the secondary site is there in case the primary site fails, and the staging disk speeds up backups by letting the 'backup agents' write to a disk cache for de-duplication and serialisation. It is still a very expensive use of capacity.
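To put rough numbers on that (the figures below are my own illustration, not measurements from any real environment), here is the back-of-envelope arithmetic for one terabyte of production data moving through that chain:

# Illustrative only: how 1 TB of production data multiplies across the
# backup chain described above.
primary_tb = 1.0

disk_copies = {
    "production disk": primary_tb,
    "DR server disk": primary_tb,
    "backup staging disk": primary_tb,
}
tape_tb = 2 * primary_tb          # two tape copies
raid_overhead = 1.25              # assumed parity/mirroring overhead on the disk tiers

disk_tb = sum(disk_copies.values()) * raid_overhead
print(f"{primary_tb:.0f} TB of data -> ~{disk_tb:.2f} TB of raw disk plus {tape_tb:.0f} TB of tape")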

What are we storing, and why are we storing it? That is the question to ask. We are all living in the IT equivalent of a 'Hoarders' household: our networks and servers are choked with information that is completely and utterly redundant, and yet it all gets retained just in case. In case of what? What is the use case for ever retrieving this data? It is understandable that in the public sector or construction, data may need to be retained for a long period, but in most businesses most data becomes the equivalent of household trash after three months; it will most likely never be accessed again. And how expensive is the infrastructure being used to store this trash?

So did SAN, NAS and thin provisioning consign the storage problem to the history books, or did they just create a brand new one, a problem that generates a lot more revenue for storage companies and a lot more cost for IT departments in hardware, energy and floor space?


There is a solution and it's easy: chargeback on the capacity used by each business unit, and a spring clean of the file systems.
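As a rough sketch of the chargeback half of that idea, assuming each business unit owns a top-level folder on a file share (the share path and report format below are my own illustration, not any particular product), a simple script can produce the monthly 'storage bill':

# Hypothetical chargeback report: sum the bytes under each business unit's
# top-level folder and print a per-unit capacity figure.
import os

SHARE_ROOT = r"\\fileserver\shares"   # assumed layout: one folder per business unit

def folder_size_bytes(path):
    # Walk the tree and add up the size of every file beneath 'path'.
    total = 0
    for dirpath, _dirnames, filenames in os.walk(path):
        for name in filenames:
            try:
                total += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                pass   # skip files we cannot stat
    return total

for unit in sorted(os.listdir(SHARE_ROOT)):
    unit_path = os.path.join(SHARE_ROOT, unit)
    if os.path.isdir(unit_path):
        print(f"{unit}: {folder_size_bytes(unit_path) / 1024 ** 3:.1f} GB")

Once each business unit sees its own number on a bill, the spring clean tends to organise itself.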

Wednesday, 14 July 2010

Hah! Who says global warming isn't occurring? Speak up.

Is it, isn't it? Should I care, and can I do anything to slow it down? It's not my normal blogging topic, but I just had to put this up: does this NASA study confirm 2010 is the hottest year on record?

http://data.giss.nasa.gov/gistemp/paper/gistemp2010_draft0601.pdf

Unlike the politicos who like to deny there is any change, pointing to the cold Northern Hemisphere winter as proof, I would like to think a NASA study is more accurate.

Saturday, 10 July 2010

Mac OS X slowdown and repair

Our 2.0 GHz Aluminum MacBook has been going slow for months: spinning beachball, lag, and generally enough frustration that even my 9-year-old son was complaining. If your Mac is performing similarly poorly, try this tip:

http://www.thexlab.com/faqs/durepairfns.html#Anchor-The-47857

Start up from your OS install disk and run a Repair Disk on the startup disk. Not a Repair Permissions: the Repair Disk option is only available when you have started up from another disk.
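For the curious, something along these lines (my own sketch, not part of the original tip) runs Disk Utility's verify step from the Terminal on the live startup disk; the actual repair still has to be done while started up from another disk, as described above.

# Rough sketch: ask diskutil to verify the startup volume. A non-zero exit
# status (or errors in the output) suggests it is time to boot from the
# install disk and run Repair Disk.
import subprocess

result = subprocess.run(["diskutil", "verifyVolume", "/"],
                        capture_output=True, text=True)
print(result.stdout)
if result.returncode != 0:
    print("Verify reported problems - boot from the OS install disk and run Repair Disk.")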

In our case the disk repair found only a couple of errors, but the difference in performance is amazing; it feels like we have a new MacBook on our hands, and I couldn't believe the difference such a simple step made.

Wednesday, 30 June 2010

RIP Microsoft KIN, you never really lived

On TechCrunch today you may see that the Microsoft KIN is to die shortly. 'She' never really lived: a smartphone without apps, on an expensive data contract. Microsoft, your competitors are eating your lunch.
http://www.mobilecrunch.com/2010/06/30/microsoft-puts-down-the-kin/

Wednesday, 19 May 2010

FIM 2010 and TEC2010

After reading Gil Kirkpatrick's analysis of FIM 2010 and what the TEC2010 community wants Microsoft to do with the platform (http://www.theexpertscommunity.com/item/view/id/4410), it is really interesting to see that customers just want Microsoft to finish the basics of the platform.
Many customers and partners I talk to are contemplating FIM 2010 or already have it installed in their organisation, but like SharePoint it is a blank canvas. My experience with blank-canvas products is that they don't go far without significant investment and effort (money and development skills). In addition, non-IT buy-in can be difficult without a 'hook' into the non-IT core of the business; that hook may come from the benefits of automating a provisioning process and removing the potential for errors in user creation.
If the chosen AD management product is not easy to configure and manage then the advantages are moot; spending money on MCS is not a return on an Identity and Access Management project.
Take a look at Quest Active Roles Server, an Active Directory management tool with real codeless configuration: http://www.quest.com/identity-management/provisioning.aspx

Thursday, 6 May 2010

'The nightmare that was'... configuring a Dell Latitude E6500 as a Wi-Fi access point

As my previous post mentioned, I recently purchased a non-3G Apple iPad; Wi-Fi alone is sufficient for almost all the events I am involved with. This time, though, I was staying in a Marriott hotel in KL, Malaysia, where the Wi-Fi did not cover all the rooms, and all I had was my laptop (E6500). This Dell laptop with Windows 7 installed is no longer the most reliable machine, thanks to my constant need to install and remove apps I think might be useful, but I don't believe that is why I had so much trouble setting it up as an access point for my iPad to connect through.

Anyone who has done their research will be aware that the iPad does not support tethering over Bluetooth to a 3G phone; a couple of sites claim this is possible, but it's not, at least not yet. The two remaining paths were either a MiFi device (a portable access point and 3G modem in one) or using my laptop as an access point; the latter option was cheaper and is what I finally went for.

Setting up my laptop to work as an access point is where my problems started. Firstly, the laptop had been locked down to prevent ad-hoc networks from being configured; this site helped me override the settings dictated by my corporate overseers, even if only temporarily: http://www.mp3car.com/vbulletin/operating-system-optimization/114261-wireless-settings-need-help-circumventing-my-stupid-group-policy.html

There is a good thread there, along with the registry entries that need to be removed so an ad-hoc network can be configured on a Windows 7 PC.

Open regedit and delete the key HKLM\Software\Policies\Microsoft\Windows\Wireless
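If you would rather script that step than click through regedit, a rough Python equivalent (my own sketch; run it from an elevated prompt, and only if you are actually allowed to override the policy) is shown below. Note that Group Policy will usually recreate the key at the next refresh, so the change is temporary.

# Hypothetical sketch: delete the Group Policy key that blocks ad-hoc wireless
# networks. Requires administrator rights; the key returns on the next policy refresh.
import winreg

def delete_key_tree(root, path):
    # Recursively delete a registry key and all of its subkeys.
    with winreg.OpenKey(root, path, 0, winreg.KEY_ALL_ACCESS) as key:
        while True:
            try:
                child = winreg.EnumKey(key, 0)   # always take the first remaining child
            except OSError:
                break                            # no subkeys left
            delete_key_tree(root, path + "\\" + child)
    winreg.DeleteKey(root, path)

delete_key_tree(winreg.HKEY_LOCAL_MACHINE,
                r"Software\Policies\Microsoft\Windows\Wireless")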

This worked for me, but I still could not get my beloved iPad to connect and obtain an IP address; the iPad would join the network and then fail to talk to the Windows machine, regardless of whether a manual IP address was entered on both the PC and the iPad. First attempt: fail!

I kept reading, as there must be a better way to manage Windows 7 wireless networking than the dire built-in functionality (the more I use Windows 7, the less I like it, and I really did like it to start with).

Secondly, I found a utility called Connectify, and this little app seemed to answer my prayers: http://www.connectify.me/index.html

I installed Connectify and, whammo, the wireless network was up and running, but the iPad still wouldn't connect. Reading a little further into the Connectify UI, it became apparent that my Intel WiFi Link 5300 card was only working as an ad-hoc network, and for some reason the iPad didn't want to connect to that either. Looking at the Connectify support list, it appeared that the card I had should be able to run in Access Point mode, so a quick search and download from Intel got me over this next hump: http://downloadcenter.intel.com/SearchResult.aspx?lang=eng&ProductFamily=Wireless+Networking&ProductLine=Intel%C2%AE+WiFi+Products&ProductProduct=Intel%C2%AE+WiFi+Link+5300+and+Intel%C2%AE+WiFi+Link+5100+products

I installed the driver and restarted the laptop; the access point now runs perfectly and the iPad connects to the internet via the Wi-Fi on the Dell laptop.

Now if only Apple would start to support tethering, even if only via a cable, life would be a lot easier. How else do I install those shiny applications and use Twitterrific?

Friday, 30 April 2010

I'm sitting here at 31,000 feet above Australia, writing this blog entry on my new Apple iPad and realising what an evolutionary device it is. The iPod touch was a halfway point: not suitable for serious use, but great for testing the waters of what can be done with the touch-screen format and a solid operating system.
While in Los Angeles this past week I decided to purchase either a MacBook Pro or an iPad; the iPad won out for several reasons, which I'll explain in more detail below. My initial feelings after purchasing a new 64 GB iPad were a little muted, due to the requirement to upgrade iTunes (a huge download over a slow connection) and then the difficulty of getting software for it without a US account, which I was able to sort out with some friendly help from my friend and colleague Dmitry Kagansky.

Observations:

Instant start-up: the device is always ready to go. Admittedly my MacBook at home sleeps and wakes within a few seconds and gives me a solid sense of what an appliance should be (my terminology for devices that simply work as they should and don't impose kludge and restriction: toaster, kettle, MacBook).

The App Store applications available cover every imaginable idea or requirement I have of a mobile device in this form factor. Many are free too, which is even better, including some very good note-taking applications that are ideal on this larger-screened 'iPod touch'.

A single device for flights and the like: I can now read tech PDF data sheets wherever I am, and there are plenty of books available to read while flying or travelling.

It's small, the size of a netbook, but it has more application for a traveller than a netbook due to the design and the typical 'use case' of these types of devices.

The iTunes ecosystem managing the Apple family of portable devices (I want to come back to an area of the iTunes ecosystem that I'm not happy about too!) makes purchasing and installing music, movies, TV shows, iBooks, audiobooks and applications easy, and the choice is vast.

Games: the games being released for the iPad are becoming very professional, with visual appeal well above the Sony PSP and the Nintendo DSi. It's my view that those platforms are going to disappear over the next few years; development costs are high for what is arguably an inferior experience to what I can get on Apple devices now. Games on iTunes are priced according to the likely usage on a portable device, which tends to be jump-in, jump-out as opposed to home-console sessions, whereas Sony, Nintendo and third parties still have to price their software to cover marketing, packaging and studio development costs.

I wouldn't be at all surprised if Apple re-release an Apple TV based on the ARM A4 used in the iPad within 12 months. Apple are a very smart company with a series of successful launches behind them now (some failures too: Apple TV, MacBook Air, etc.) and will design a UI and control scheme that fits the market. At this stage in the 'game', Sony, Microsoft and Nintendo should be very worried! They all have a lot to lose in the casual gaming market, the living room and the minds of consumers.


On to iTunes: I have some reservations about what iTunes is becoming; it's too big and tries to do too much. Apple are controlling the experience and the revenue opportunity through the devices that use iTunes. I think there is a place for an independent mechanism for applications to be supported, or at the very least for access to shared data locations on the iX devices (iPod, iPhone, iPad), as each application I have installed has its own data area for Pages docs, PDF files, photos and so on; maintaining independent silos of information is not the best way to handle user data.

On top of this, iTunes seems to be becoming too much. If I want to sync data to my iPad, why do I need to have the device connected and work through the iTunes menus to reach a dialog that lets me copy data across, unless I use iWork online? I should be able to configure a folder and share data through that folder, or a series of folders, just as I do with the folders on my Mac.

Disclaimer: I am a long-time Mac user, but that doesn't get in the way of my understanding of where I am being taken by continuing to support the Apple ecosystem of DRM and micro-transactions, which restricts my freedom to choose devices based on what is best rather than who makes them.

Monday, 22 March 2010

Systems administrators and Solutions Architects

How many people have inflated their role within a company? I'm sure many of you have. I have called myself by different titles, with 'Senior' being the most common. Why? Nowadays I don't know. What was the impact of the title change? No one came to me with enhanced expectations just because I was now a 'Senior Janitor' and no longer a mere 'Janitor'.
In IT there is a tide of self-fluffing through role enhancement, from Senior Systems Administrator to Assistant IT Director through to the current favourite, 'Solutions Architect', which may be the most misrepresentative title to date for many of the self-appointed.

Let's break down what an IT Architect is: someone who develops, designs, plans and understands the broader picture. An IT Architect can relate technology to anyone within a business; this role is not typically a detail person, but rather a high-level thinker who understands concepts and plans and relates them to technology.

The Architect fragments into various titles, from Technical Architect (shouldn't an architect be technical anyway?) to Enterprise Architect (a high-level person who recommends centralising systems and minimising application stacks?) to my favourite, the Solutions Architect.

What is a Solutions Architect? Does a Solutions Architect look after one solution or many? What is the solution that an SA manages? For all I know, this role works with water and aspirin and creates a solution by dropping the aspirin into the water.
Currently 'Solutions Architect' is the favourite title for an IT administrator to adopt, and what does it mean? As you can see above, no customer is going to take you any more seriously than they would a Systems Administrator; it is the proof of experience that makes a person suitable for a title, not the title itself.

Personally, I've given up. No longer will I be a Senior (no grey hair), nor will I be a Solutions Architect. I am a Consultant (I talk to customers and relate my experience to their requirements), and proud of it.