A manager's account of PlusNetters pulling together during the floods to ensure that customer service (and its quality) was maintained during the hardest hours of many people's lives.
Melise Jones, Head of Network Operations, gives us a timeline of how we as a business reacted through a period of increased pressure in June 2007 and how we managed to keep the business ticking over with very little customer impact.
Sheffield Floods June 2007
In keeping with the spirit of our open and honest communication with our Customers, we're publishing an account of our experiences and how we responded to the flood event which occurred in June 2007 in the Sheffield area. It was an event which took us all by surprise, and we want to communicate not only how the flood affected us, but also to share some of the lessons we learned in the process. We welcome any feedback or accounts of how you've coped since the event.
25JUN07
Sheffield was badly hit by flooding as ‘a sixth of the annual rainfall’ fell ‘in just 12 hours’ over the Yorkshire region.
PlusNet stayed up and continued to support Customers, but we were a few heads down as several staff members were unable to get into the centre of Sheffield. As a consequence our telephone response times were a little below normal, but staff members used secure remote access (VPN) to work remotely and help process customer tickets (questions raised via the website). At this time, we asked customers to raise any issues via the Help Assistant on our website, rather than phoning in.
26JUN07
At PlusNet's Sheffield Headquarters we formed an Emergency Response Team (ERT) comprising managers and technicians from the Customer Support Centre, Network Services, Facilities, Development and Product Development. The team met to assess the situation and to put an action plan into effect to deal with the impact the flooding was having, and could have, on PlusNet Services and Customer Delivery.
We ensured the Customer Support Centre had sufficient resource (remote and on-site) to continue to service customers, primarily through ticket responses but also via the telephone. The majority of homeworkers were assigned to work tickets whilst those in attendance at PlusNet's HQ were manning the telephones.
We issued frequent Service Status messages on our Portal, providing our Customers with updates on the situation.
With Customer Support under control, we turned our attention to our Disaster Recovery Plan. Whilst the flooding in Sheffield was isolated to certain areas, there was a real possibility that access to our main Customer Support Centre would be denied if the flooding became more widespread or severe. We therefore immediately activated our Disaster Recovery (DR) Site, which is located in another part of Sheffield. We confirmed the site was safe and had adequate power, and verified our generators were in good working order by failing the site over to generator power. We also arranged fuel top-ups for the generators should re-fuelling be required. Additionally, we activated our Call-Forwarding Plan for Customer Support and sent a small team of Technical Support Analysts and Network Operations Engineers (for Operations Support) to the DR Site.
On this day, we made a Community Site Posting and started our Flood Blog.
Our ERT agreed to hold twice-daily meetings to manage progress and any changes occurring during the flood event.
27JUN07
CSC had 8 staff and a Technical Shift Manager attending the DR Site.
Today, the PlusNet Customer Community reacted to the Community Site Flood Blog and started adding their own photos and personal accounts to the thread. By this time we had received many kind messages from our customers, and we thank you for your support during this period.
PlusNet Staff who can get in to work maintain the 'Business As Usual' approach.
Other PlusNet Staff who can't physically attend work continue to use our secure remote access to work from home.
28JUN07
CSC had 8 staff and a Technical Shift Manager attending the DR Site.
PlusNet Staff who can get in to work maintain the 'Business As Usual' approach.
Other PlusNet Staff who can't physically attend work continue to use our secure remote access to work from home.
On this day, we received information from Yorkshire Electricity that power to PlusNet's Headquarters in the City Centre might be diverted to other parts of the City. The ERT met again to put together an Action Plan to address the logistics of Business Continuity in the event power was cut to the PlusNet HQ.
Fortunately Yorkshire Electricity didn't have to cut power to the PlusNet HQ, but the Disaster Recovery Plan we had in place meant we would still have been able to service Customers and keep our Business going.
29JUN07
When the ERT met this morning, an Action Plan was agreed to ensure the Business Continuity Plan would carry through the weekend, as more heavy rain was predicted for the area.
All Shift Managers from the Customer Support Centre were fully briefed on the DR Site access plan in the event Customer Support Centre Operations needed to relocate to the DR Site.
Additional PlusNet Developers and Network Operations staff were put on alert in the event these resources were required over the weekend.
02JUL07
Fortunately, weather conditions improved over the weekend. Power remained stable and the PlusNet HQ facilities remained dry and accessible. The Action Plan to relocate operations to the DR Site was a success in that, had the need arisen, we could have re-deployed Customer Support Centre staff and telephony infrastructure to the alternative location and continued business operations.
We kept Customers informed through frequent Service Status updates and the Community Site Flood Blog.
--
Ironically, PlusNet had already been in the process of reviewing and updating its Business Continuity Plan; the events of the Sheffield Flood dramatically accelerated its testing and implementation! We did, however, verify our ability to keep our main business operations up and running during a period when access to our main place of business was degraded (and in some cases denied).
The Customer Support Centre, our most important business function, was able to continue with little or no degradation in quality or in our ability to effectively service our Customers. Through existing communication channels (Service Status, Community Site & Telephony), we were able to continue our dialogue with Customers and provide them with updates on the impact the event was having on the Business and on our ability to support our Customer Community.
What did we learn?
- You can never over-prepare for an event which may remove your ability to keep your business going.
- We have two fully resilient Sheffield-based Data Centres where our Services Platform is hosted, along with our London-based Points of Presence (PoPs) where our ADSL and Dial services are hosted.
- If the Ulley Reservoir had been breached, our Customers' broadband services would've been completely unaffected, but our Services Platform could potentially have been seriously degraded. As a result, we've brought our plans forward to relocate one of our Data Centres outside the Sheffield area.
- Our Disaster Recovery plan, which coincidentally was being updated at the time of the event, was put into play and worked as we needed it to. Our Staff knew what to do, where to be and the lines of communication were kept open to tell our Customers what we were doing.
Links:
- Forum Thread with Photos / Videos from Staff & Customers
- Flood Blog Post