Corporate Affiliations:
Years of Experience: 28
Amazon.com, 2007–present
Fifth Third Bank, 2002–2007
Digineer, 1999–2002
Professional Data Resources, 1992–1999
BCD Technology, 1991–1992
BDM International, 1984–1991
California Institute of Technology, 1981–1984
Software Engineering: UML, Patterns, Booch, Rumbaugh, Gane & Sarson, Yourdon, Chen, Ward & Mellor
Operating Environments: Linux, Unix (AIX, Solaris), J2EE (Tomcat, WebSphere, WebLogic), EJB, JMS, servlets, JSP, Apache, AVR/Arduino, Windows, MFC, ATL, STL, COM, CORBA, Client/Server (TCP, RPC, iTRAN, Tuxedo), VAX/VMS, MS-DOS, MVS
Programming Languages: Java, C++, Perl, XSLT, XSL-FO, YACC, Lex, C, ProKappa, FORTRAN, SLAM II, COBOL, Pascal, DCL, JCL, BASIC, Prolog, VAX assembler, Lisp, Forth
Text Formats: XML, SGML, DSSSL, HTML, RTF, TeX
Database: Oracle, DB2, MS SQL Server 6.5, Sybase System 10/11, Rdb/VMS, Ingres, Paradox, Datacom/DB
Software Cost Estimation: Function Points, SPQR/20, CheckPoint, COCOMO
System Administration: Solaris, Linux, Windows NT/Back Office, VAX/VMS, Unix, Rdb/VMS, Ingres
Operations Research: Inventory management, readiness assessment and optimization, manpower assessment, shop floor capability analysis
Amazon.com (2007–present)
My current assignment at Amazon.com is on the caching team, where I am developing a predictive refresh mechanism that issues asynchronous calls to backend services to refresh cache entries before they expire. I also worked on a distributed database system supporting Amazon's open source Carbonado API that replicates data through clusters of hundreds of hosts. These are critical systems, and I am responsible for design, implementation, and testing. Because of their criticality, this includes not only standard unit and integration tests but also one-pod tests and A/B tests that assess performance gains against production traffic.
Previously at Amazon, I was the lead developer on the item metadata system, which controls the processing of item data received from Amazon retail and merchants and published to the retail website and the A9 search engine. It supports dynamic user interface generation both on the website (in applications such as Seller Central) and in Eclipse-based GUI clients using the Eclipse Modeling Framework. As lead developer, I was responsible for working with system users to develop requirements, for the system design and the implementation of core algorithms, and for mentoring junior teammates.
Fifth Third Bank (2002–2007)
My last assignment at Fifth Third Bank was as systems officer leading the Java Server Development team in the eBusiness Architecture group. I led a team of seven engineers in developing the Host Integration middleware tier for the Bank's online systems. This component uses stateless session Enterprise JavaBeans (EJBs) to allow client applications to connect to twenty-five different backend systems using a variety of technologies, including Java Database Connectivity (JDBC), Java Message Service (JMS), screen scraping, and custom socket interfaces. It is used by a variety of systems, including the internet banking, customer service, and voice response applications. The code is deployed both embedded in J2EE applications and as a stand-alone web service for use by SOAP clients. As technical lead, I was responsible for maintaining the release schedule, assigning and approving change requests and tasks, running code reviews, and performing code deliveries, in addition to developing several of the interfaces.
As part of this assignment, I led the initial deployment of the Rational Unified Change Management process at the Bank. I worked closely with the ClearCase and ClearQuest deployment team to define process workflow and the roles and responsibilities for the software development process. I also helped with the ClearQuest database schema definition and the requirements for each of the workflow approval gates.
Through the combination of clear release package definition, a formal testing procedure, and collaborative code reviews, we reduced the rate of delivered defects in the Host Integration software to one or fewer per release. The concrete result of this improvement has been increased trust from the other software development groups in the Bank and a willingness to accept releases and patches with confidence that they will not introduce new and unexpected bugs.
Previous assignments at the bank include development of a custom tag library for integration with FileNet, integration services between mainframe applications and the Remedy system, and analysis of failure modes for common components throughout the bank's Information Technology infrastructure.
Digineer (1999–2002)
My final assignment at Digineer was as lead engineer on OC5, a web-based application that drives online marketing campaigns and training. I led a team of seven engineers and a graphic artist in developing OC5, working with a project manager and a QA team. The application was developed in Java, using servlets, JavaServer Pages (JSPs), and EJBs running on the WebLogic J2EE application server. It uses custom tag libraries and JSP templates to dynamically generate web sites from customer-provided content and a data-driven site map and state machine. The client can monitor the progress of a campaign in real time through the companion OC5track application, which generates dynamic charts showing metrics such as presentations over time and targeted vs. untargeted users.
OC5 also includes a set of tools for site creation and administration. These include a site layout tool, which generates a site map in XML that is uploaded into the Oracle database, and an administrative mode, in which the web pages in a customer site are rendered through the custom tag libraries to provide a WYSIWYG environment for editing the site content. There are also reports and back-end maintenance operations accessed through an administration web page. These include output data interfaces for clients generated via Oracle XSQL queries, reports that use the same XSQL query mechanism along with the Apache Xalan XSLT and FOP processors for output formatting, and an XML-over-HTTP output interface to the payment processor.
Previously at Digineer, I was lead engineer on a set of web-based applications providing various tools to physicians. I led a team of five engineers, a tech writer, and a graphic artist in developing these applications, working with a project manager and a QA team. We developed the applications in Java, using servlets, JSPs, and EJBs running on the WebLogic J2EE application server. Before that, I was lead engineer on a number of projects developing requirements and designs for similar portal sites aimed at doctors, and led a five-person team that developed the high-level architecture for a vertical portal implementation, evaluated products for its major building blocks, and worked on the detailed design of an actual implementation.
Professional Data Resources (1992–1999)
My final assignment at PDR was at Convergys, developing a CORBA server in C++ using the Catalysis methodology to provide a catalog of products and services for telephony and related industries. Previously, during the summer of 1999, I was at Component Software International, where I was the principal developer on a five-person team rewriting an OS/2 Smalltalk-based operating room scheduling system in C++ for Windows NT for Y2K compliance. I developed the scheduling system using MFC with heavy reliance on the STL, with Sybase System 11 as its database. Fewer than five bugs were encountered in the system during beta test.
From 1997 to 1999, I was assigned to Lexis-Nexis, where I developed COM components in C++ using ATL for parsing, manipulating, and formatting SGML and XML documents on a major new web-based client/server project. I developed NFA and DFA regular expression matching engines using the STL for use in parsers, an SP-based SGML-to-XML translator COM object with DTDs to support its use, and an HTML-to-XML translator COM object. I also developed a COM object wrapper for the Jade DSSSL engine, enhanced the engine to support additional RTF and TeX constructs, and developed DSSSL style sheets for formatting and printing XML documents using RTF and TeX. I ported the regular expression matching engines and the XML parsing and manipulation objects to Unix, and worked with the Unix print team to integrate the Windows NT and Unix-based print servers and to develop TeX macro support for the DSSSL engine output.
Previously at Lexis-Nexis, I was responsible for implementing and debugging interpreters for scripts embedded in documents, using C++, Lex, and YACC. I also assisted in debugging the C++ Win16 client application for the Research Manager client/server product; tested the product under BoundsChecker and the debug kernel for API errors, resource leaks, and allocation problems, and repaired them; analyzed the product for Y2K compliance and implemented patches to its date-handling code; and headed the redesign of the document browser and presentation module. In addition to my work at Lexis-Nexis, I was responsible for administering the PDR office LAN and BackOffice software.
From 1994 to 1997, I was assigned to Reynolds & Reynolds, where I was the technical lead responsible for the implementation architecture and development of C++/MFC client/server applications under Win16, connected via ODBC and CT-Lib to a Sybase database server. I led the effort to convert the product to the 32-bit environment, developed the core architecture components, developed the C++ objects that adapt the MFC document paradigm to a persistent object model, and developed the object-to-relational persistence layer that stores the objects in a Sybase database. I also developed several GUI controls and was lead for the Contact Management and Literature Fulfillment subsystems.
From 1992 to 1994, I was assigned to Cincinnati Bell Information Systems, where I developed client/server application software in C++ and ProKappa/C. I served as DBA and data access software developer for the bill creation product, which operated on both the Sun/Oracle and MVS/DB2 platforms. In addition, I developed a code generator that created the persistent object framework and object-to-relational persistence layer from an object model specification. Previously, I was DBA and domain object developer for version 2 of the Common User Access product, which ran on Sun/Motif and MS Windows clients with Sun/Oracle servers. Earlier, I led the team developing data integrity rules for the FOCIS project and developed the tools and methodology for their implementation. Other assignments at CBIS included the design and development of the system administration file maintenance and reporting software and database analysis.
BCD Technology (1991–1992)
At BCD, my principal assignment was the Automated Train and Crew Dispatching System, a distributed database system with a fault-tolerant VAX processor using Rdb/VMS networked to SCO Unix workstations with local Ingres databases. I was the DBA for the development effort and designed the communications/distributed update system.
BDM International (1984–1991)
At the BDM Dayton office from 1989 to mid-1991, my major assignments all involved computer-aided software engineering (CASE) technology using the Excelerator package. I performed a detailed analysis of the software development documentation life cycle at BDM, and developed a data model and software suite to support it. The suite included a 4GL report formatting package, cross-referencing tools, mainframe interfaces, and document templates.
At the BDM Los Angeles office from mid-1987 through 1988, my last assignment was development of a model for the coupling of laser energy into targets as part of the Shock Response in Advanced Materials (SRAM) project under Dr. Marshall Sparks. Earlier, I designed and implemented several PC-based availability/readiness models for a classified customer evaluating dormant reliability, failure detection, and shop floor analysis, and a generator that converted C-17 reliability and maintainability data into flow networks for the Logistics Composite Model (LCOM).
At the BDM Dayton office from 1986 through mid-1987, my last assignment was the planning, budgeting, and implementation of the CASE center. I also produced software development cost estimates for the RDB project using COCOMO, Function Points, and SPQR/20. Previously, I designed the implementation of the Aircraft Availability Model (AAM) for the Air Force Logistics Command Requirements Data Bank (RDB) project and prototyped various key algorithms in FORTRAN under MVS. Prior to that assignment, I developed algorithms for use of the indenture structure and interchangeability and substitutability relationships in the RDB weapon systems availability and requirements computations.
In my initial assignments at BDM Dayton from late 1984 through 1985, I wrote the Functional Description (FD) for the Initial Requirements Determination segment of the RDB. Earlier, I evaluated the LMI AAM model, the Rand Dyna-METRIC model, and the Contel WARS model for possible implementation in the RDB.
California Institute of Technology (1981–1984)
My last assignment at the California Institute of Technology was developing a work-study tracking system for the Office of Financial Aid. Previously, I was the principal investigator on the G0.0 experiment, which observed the Galactic Center at the 3.8 and 13 cm wavelengths via aperture synthesis, using three major radio telescopes with the MkII very long baseline interferometry (VLBI) data acquisition system. I also performed a variety of tasks for the L23 and L26 experiments, which observed the Galactic Center and NRAO 530 at 3.6 and 1.3 cm using the MkIII VLBI data acquisition system with six major radio telescopes. Earlier, I developed a set of device drivers for a device-independent graphics system under Unix and VAX/VMS.
References and salary history available upon request.