Accomplished professional with expertise in C#, ASP.NET, and Java, alongside strong skills in VMware, AWS, Azure, GCP, and SQL databases. Proven leadership and communication abilities enhance team collaboration and project outcomes. Experienced in applying methodologies such as PRINCE2 and UML to ensure project success. Proficient in using Git and Jenkins to optimise development workflows.
Overview
46 years of professional experience
21 years of post-secondary education
Work history
Infrastructure and Operations Manager
Tangent
London
08.2018 - 08.2025
Configured and maintained IaaS, PaaS, and SaaS services across AWS, Azure, and GCP.
Managed domains and SSL certificates to ensure secure access.
Investigated and triaged production issues to minimise downtime.
Developed scripts in Bash, Go, Python, and PowerShell for automation tasks.
Implemented continuous integration and continuous delivery processes for efficient deployment.
Monitored infrastructure using Prometheus and cloud provider tools for performance optimisation.
Maintained an effective cloud-based JIRA and Confluence instance to facilitate project management.
Safeguarded against security threats through endpoint monitoring and adherence to ISO27001 standards.
Big Sofa
10.2017 - 08.2018
Company Overview: An exciting company with technology in a rapidly expanding marketplace.
Assimilated the current infrastructure and derived an automated way forward for multi-country deployments complying with PII and GDPR requirements across whichever cloud, virtualization, or hardware platforms suited.
Technical Operations Administrator
The BMJ
09.2014 - 10.2017
Migration of an externally hosted Windows/SQL stack into the existing infrastructure, followed by a second-level migration from .NET to Java, Windows to Linux, and MsSQL to PostgreSQL.
Maintenance, monitoring, backup, and disaster recovery (DR) of existing infrastructure.
Support of developers and users of infrastructure.
Migrated the whole infrastructure, from all sources, into a standard pattern across environments, puppeting and converting to SOA as it was processed, removing any shared objects, with no downtime.
Linux Ubuntu 14.04 LTS and Windows Server 2012 servers.
MsSQL, PostgreSQL, MongoDB, xDB, MySQL, IIS, Apache, Tomcat, Elasticsearch, Quova, Gradle, Jenkins, Ant, and more.
All servers fully puppeted, with web servers getting content from Jenkins and database servers getting content from AWS S3 backups and WAL files.
Created the profiles for an auto fire-up/tear-down environment that can be implemented in an AWS environment for development.
This also allows complete infrastructure re-creation anywhere the Puppet server is instantiated, for DR.
Puppet code, ENC, and Hiera data all in GitHub.
PostgreSQL auto failover mirroring, and WAL archiving to AWS S3.
DNS management in AWS Route 53 via Jenkins.
With Prod and the majority of Stage migrated to The BMJ’s new private cloud provider, I’m migrating the rest of stage and all dev into AWS.
This uses VPN, VPC, Subnets, Security Groups, Direct Connect, SQS, SES, STS, EC2, RDS, and S3.
I’m also establishing an AWS portal account to handle IAM security, using roles to grant controlled access to the other AWS accounts (dev, stage, and prod) for separate billing models.
Currently I’m using Perl with the AWS CLI to dynamically bring up base Ubuntu and Windows AMIs, puppeting them to usability with a call to Jenkins to populate content.
This creates a simple mapping into Terraform for each destination VM, allowing dynamic tear-down and bring-up; servers can then be grouped into a 'service' as the unit to tear down and bring up.
One of the products is the med school selector app, a Gradle package built with a Jenkins pipeline through dev, bdd, and stage.
The majority of the other apps are built, tested, and deployed to dev on GitHub check-in, with a manual push to stage for UAT, then a release tag and push to live when scheduled for production release.
Also a couple of Perl scripts to power EC2 and RDS instances down and up outside working hours to reduce costs.
Everything is monitored with mon, Munin, AppDynamics, and the hosting provider's packages, all backed up to S3, with logs centralised in Logstash.
The next step is Terraform and Consul.
Terraform to bring up 'services' (all the objects required to run a particular product, e.g. china-in-a-box).
Consul to acquire all server info and, specifically, to allow dynamic DNS creation for Auto Scaling servers.
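The off-hours power-down scripts above were Perl against the AWS CLI; their source is not shown here, but the scheduling decision they make can be sketched in Python (the working hours, weekend rule, and function names below are assumptions for illustration, not the original code):

```python
from datetime import datetime, time

# Hypothetical working hours; the real scripts' schedule is not given above.
WORK_START = time(7, 0)
WORK_END = time(19, 0)

def should_be_running(now: datetime) -> bool:
    """True if a dev/stage instance should be up at this moment."""
    if now.weekday() >= 5:  # Saturday/Sunday: stay powered down
        return False
    return WORK_START <= now.time() < WORK_END

def action_for(now: datetime, currently_running: bool) -> str:
    """Decide the EC2/RDS action; 'start'/'stop' would call the AWS CLI or API."""
    want = should_be_running(now)
    if want and not currently_running:
        return "start"
    if not want and currently_running:
        return "stop"
    return "none"
```

A cron entry running such a check on a short interval is one common way to drive it.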
Web Hosting Engineer
Emperor Design
10.2009 - 08.2014
Deployment, testing, debugging, backup cycle of client websites.
Responsible for the company's virtual farm and infrastructure, and client servers.
Maintenance of dev, stage, QA, and live environments.
Consistency of code within Subversion and across all platforms.
Implementation of automated tools to centralise current server website configurations and IIS logs for Urchin.
NAnt service check to ensure sites are up and running.
Bespoke backup process to schedule and centralise website file and database full and incremental backups.
Bespoke restore process to allow scheduled 'replication' of servers.
Installation, configuration and maintenance of Nagios Core to monitor the virtual farm.
Technical support assistance for Client Support providing technical triage and solutions.
Assistance to Developers in setting up working environments for specific projects and brainstorming solutions to individual specific problems.
SSO project for a client intranet.
Debugging, fixing functionality bugs in ASP.NET code.
Resolving 100% CPU application pool problems, memory leaks, and SQL bottlenecks.
Migration of the entire virtual farm from company-owned hardware into a virtual data centre: building Windows and Linux servers; installing and managing AD, DNS, GPO, and mirrored SQL; migrating existing HTML, ASP, ASP.NET, Emperor CMS, Umbraco, and EPiServer websites from Windows Server 2000, 2003, and 2008 to Windows Server 2008 R2 servers; and monitoring and proxying with Nagios, purpose-built Bash scripts, and HAProxy on Linux.
The entire process took 9 consecutive 7-day weeks and caused negligible to no downtime for all clients, as any opportunity to run a process either in parallel or with instant cutover was taken.
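The bespoke backup process above combined scheduled full and incremental runs with centralised storage. As a minimal illustrative sketch of the scheduling decision (the weekly-full-on-Sunday rule and the naming scheme are assumptions, not the original implementation):

```python
from datetime import date

def backup_kind(day: date, full_weekday: int = 6) -> str:
    """Weekly full backup on one weekday (default Sunday), incrementals otherwise."""
    return "full" if day.weekday() == full_weekday else "incremental"

def archive_name(site: str, day: date) -> str:
    # Hypothetical naming scheme for the centralised backup store.
    return f"{site}-{day.isoformat()}-{backup_kind(day)}.bak"
```

A scheduler then only needs the date to pick the run type and the archive name for each site.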
Plan Personnel
06.2009 - 10.2009
Temporary work.
Web Hosting Engineer/Release Engineer
TMP Worldwide (UK) Ltd
01.2009 - 06.2009
Responsible for all processes outside design and development of websites.
Support of existing processes, hardware, and sites, with sites coming in from and being exported to external hosts; migration of existing ASP-based sites to the new server pair; and creation of a new process in two streams of work: one for simple ASP sites stored in Perforce, whose deployment process needed replacing to support the existing and new environments, and one for all other sites under the .NET framework, from both Perforce and TFS, to the new environment.
I replaced the existing ActivePerl-based deployment system for ASP sites under Perforce with a completely new process using NAnt and OpenSSH.
This meant discovering what the existing process did and what the destination environments were, then creating a replacement that handled the existing sites, stayed as close as possible to the existing usage, provided new functionality, and combined the full deployment process, including automated IIS setup, in one tool.
It also included initial holding and closing pages for new or expired sites, and uninstallation of sites, including removal from IIS, from the same tool.
This was implemented in two and a half to three months on both the existing Windows 2000 and 2003 servers with IIS 6 and the new load-balanced Windows 2003 live servers with IIS 6, and has run successfully ever since.
The solution was also implemented to support sites held in TFS.
The second area was subdivided into .NET solutions within Perforce, which the above process covered, and new solutions in TFS, including sites with the newly acquired CMS EPiServer, deployed to the new Windows Server 2008 IIS 7 environments.
Two virtual machine servers, each with a SQL Server and a Windows 2008 IIS 7 server, were created (the second of which I built) to support the development and test environments; the new live environment was a load-balanced pair of Windows 2008 IIS 7 servers with a shared IIS config, shared fileshare, and shared website source.
The TFS MSBuild process was extended to deploy the TFS built websites to each of these environments.
The new CMS, EPiServer, was also finally included in the automated deploy process after working out how to set it up in a shared load-balanced environment.
The existing CMS Sitefinity sites were also included in the automated process.
The above NAnt process would support all of these sites but did not yet support IIS 7, so an interim automation of the IIS 7 setup was initiated.
Having established the deployment process for each type of website, the creation of that 'team build setup' was then to be automated.
Build and Environment Manager
Globet International Sports Betting
05.2007 - 11.2008
Responsible for the developers' environment and support tools related to source control management, continuous integration, and deployment.
Initially there was no automation, source control was a mess, and a more robust source control provider was required, so the relevant source code was extracted from VSS and imported into Perforce, then automatic label creation for versioning was implemented to include all elements required for a clean build of each product.
Using the label of a known code version, CruiseControl was implemented for continuous integration, with NCover for unit testing and code coverage and FxCop for coding standards.
The next step was an automated process for deploying known, tested versions of each product, with its relevant language files, to various 'environments': localhost, development, test, and live.
Using OpenSSH with SCP, CACLS, and other command-line utilities, a completely automated deploy of ASP and ASP.NET web solutions to 'virgin' Windows 2003 servers, controlled by one central XML file, was achieved, maintaining a record of what was deployed where.
Because of the labelling, PCI DSS requirements were easily met, a genuine rollback to a known code version if any bugs were detected was covered, and fully restorative uninstalls followed.
App.config and Web.config files were parameterised so that only one skeleton file was held for deployment to any environment; this was later extended to any file that needed tailoring per environment.
As Perforce was not entirely seamless within Visual Studio, the code was migrated again, this time into Team Foundation Server, and the entire automation project was modified to support it as the source control provider.
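The parameterised App.config/Web.config approach above kept one skeleton file per config and expanded it per environment at deploy time. A minimal Python sketch of the idea (the key names, environments, and values are invented for illustration; the original tooling was NAnt-based):

```python
from string import Template

# One skeleton fragment held in source control, with placeholders...
SKELETON = '<add key="DbServer" value="$db_server" />'

# ...and per-environment values substituted at deploy time.
ENVIRONMENTS = {
    "development": {"db_server": "dev-sql01"},
    "live": {"db_server": "prod-sql01"},
}

def render(env: str) -> str:
    """Expand the skeleton for one target environment."""
    return Template(SKELETON).substitute(ENVIRONMENTS[env])
```

The same substitution step generalises to any file needing per-environment tailoring, as the original process was later extended to do.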
PC Support Technician
Digital Excellence
01.2006 - 04.2007
Installing and tailoring Windows XP, its utilities, and end-user applications.
Providing consultation with Office (Access, Excel, FrontPage, Outlook, PowerPoint, and Word), HTML, Visual Studio 8.0, XML, and XSLT.
Bespoke applications using WinForms, ASP.NET, and C# (.NET 2.0) under Visual Studio 2005, with SQL, to provide small database-manipulation packages, with summary output in XML for XSLT conversion into Word.
For ease of tailoring, used UML use cases to present to the client.
Operator, Developer, Development Lead, Sys Admin, Ops Analyst, Data Processing Manager, Technical Lead
Printronic
South Wimbledon, Merton
01.1980 - 12.2005
The company had been using the mainframe for all its processing, with home-grown programs written in 390 assembler running under VM/VSE since 1980.
The decision was taken to re-write all the software in a portable object oriented language to provide flexible maintenance and future development possibilities.
Write, test, unit test, and release test the following using C# under Microsoft Visual Studio .NET 2003 (.NET 1.1), SourceSafe, and NUnit, with tests written alongside the code as test-driven development.
As projects were large, create UML use cases for user interaction definition, existing parameters, and inter object communication.
Staged data supported either resident in SQL or in an external home-grown structure.
Duplicate Analysis – Process de-duplicated input as dupe groups, maintaining counts by list of cross matches and produce output reports in XML with XSLT to reformat.
Dupe Group Processing – Process de-duplicated input as dupe groups, allowing modification of the master of the group, the contents of the group, and updating slave information into the master.
Support external routine calls to provide the Priority, Comparison, and Update functions.
Record modification/selection – Process any input to one or more outputs modifying or creating fields with table lookups, field scanning, copying, and constant insertion, selecting records on field contents, table lookups, or nth naming.
Output formatting – Process any input to a customer defined output format.
Possibly applying mixed casing, salutation, endearment, input-selectable variable text, and outputting in fixed, CSV, or proprietary format.
Sort – Process one or more inputs into a required sequence by sorting, or merging where the inputs are already in the desired sequence, possibly summing records with matching sort keys by record or into a sum field.
Support input and output modification of the records with external routine calls, with an eye on Mail Sort support.
75 external routines or 'exits' were also converted from 390 assembler into C#.
Supplement conversion from mainframe to C# of campaign data processing.
Using C# .NET 1.1 under Visual Studio 2003, analyse the job flow, condense the processing into something more object-oriented, and create a new processing flow that uses recently rewritten applications and bespoke C# routines and classes, again supporting staged data either resident in SQL or in an external home-grown structure.
Write a date conversion/validation 'exit' for VM/VSE in 390 assembler.
Provide a callable 'exit' for applications running in VM/VSE to produce well formed XML and integrate the 'exit' into all existing applications to output their counts.
Provide support for VM/VSE and its users' applications.
Maintained, modularised, standardised, and upgraded the in house assembler applications.
RPGII report for matrix analysis of list rental de-duplication.
Rewrote the letter printing program, handled proportional fonts.
Created APA support without PMF or PSF.
Drove STO APA printers with PIOCS.
Wrote programs with VSAM interfaces to build tables from the new PAF data, including large users in the same format as small.
Updated in house software to access new PAF data.
Take projects and subdivide them into units requiring various levels of skill/knowledge and allocate these tasks to relevant members of staff.
Manage the progress of the tasks.
Technical consultant evaluating the feasibility of connectivity and supporting equipment to its fullest and most versatile abilities.
Initial research for upgrade to VM/ESA with VSE/ESA.
Live support and front end user interface coding on PS/2's with OS/2.
24 hour call as technical support and trouble shooter for any aspect of work done on the mainframe.
Produced a 'foreign duplicate elimination' suite of support routines.
Added postcoding logic to the data standardising program.
Wrote a dumping program to handle different formats of supplied data, and machine specific code (e.g. 6 bit ICL, 6 and 9 bit Univac SDF) similar to DITTO.
Coded centralisation of text and word justification/wrapping within specified margins in the letter printing program.
Wrote a generic program to print duplicate groups clearly for checking duplicate elimination runs.
Wrote CA-SORT exits for rebate sortation, prior to the invention of Mail Sort.
Wrote Macro-4 Logout exits.
Took over full support and upgrading of the letter printing program.
Added support for STO's 6100 laser printers, including the high performance mode.
Wrote a program to scale XEROX fonts down from 300 dpi to 240 dpi for the STO 6100's.
Wrote a link and go selection program.
Established JCL standards with support routines and software.
Site planning and synchronisation of OEMs for the relocation of the entire mainframe and company.
Install VM/SP with DOS/VSE as a guest operating system.
Install CA-DYNAM/T.
Produce a suite of REXX/XEDIT macros to aid in supporting fonts across all machines.
Documented all my accumulated knowledge.
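The dupe-group processing described above worked on groups of matched records, with a master whose missing details could be updated from the slaves. A minimal Python sketch of that shape (the match key and the first-record-as-master rule are assumptions for illustration; the originals were C# and 390 assembler):

```python
def dedupe_groups(records, key):
    """Group records sharing a match key; the first record seen acts as master."""
    groups = {}
    for rec in records:
        groups.setdefault(key(rec), []).append(rec)
    return groups

def roll_up(group, fields):
    """Fill empty master fields from the slaves, as in dupe-group processing."""
    master, *slaves = group
    merged = dict(master)
    for slave in slaves:
        for f in fields:
            if not merged.get(f) and slave.get(f):
                merged[f] = slave[f]
    return merged
```

The real systems also supported external routines for the priority, comparison, and update functions; here those are fixed as first-wins and fill-if-empty for brevity.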
Education
½ year - Electronic engineering with computer science
University College London
London
09.1977 - 07.1979
1 year - Statistics with computer science
University College London
London
09.1977 - 07.1979
CSE - Computer Studies
Westminster City School
London
09.1971 - 07.1977
O Level - Mathematics, Biology, Chemistry, English Language, English Literature, Physics
Westminster City School
London
09.1971 - 07.1977
A Level - Mathematics, Further Mathematics, Physics
Westminster City School
London
09.1971 - 07.1977
Skills
Team collaboration
Leadership skills
Effective communication
Programming languages
C# and ASP.NET
Java and JavaScript
Bash and Perl
Python and Ruby
Legacy languages (COBOL, REXX)
Assembler languages (Z80, 6502, BAL)
Database management (MySQL, MsSQL, PGSQL)
MongoDB and XML
Web technologies (HTML, XSLT)
Object-oriented programming
Project management and modelling methodologies (PRINCE2, UML)
Version control systems (GitHub, GitLab, Azure DevOps, Subversion)