Data Architect | Master Data Management | Big Data Developer | Analyst | Data Engineer | Aspiring Data Scientist

Shivaram Challa Interactive Resume (challa.net/profile)

PROFILE

IBM CEO Ginni Rometty said it best: "Data is the new natural resource."

I specialize in developing data-centric solutions to fulfill organizations' information needs. Experienced in designing, implementing and maintaining system interfaces with synchronous and asynchronous communication, data extraction, data profiling, data integration, data quality and metadata analysis, while using the most suitable tool and/or technology for the job, including the Hadoop platform (both Cloudera and Hortonworks), direct database access, web services (WSDL), message queues, CSV/text files, web scraping, programming APIs, FTP, XML, XSL, and custom programming using C#, ColdFusion (CFML), VBScript and batch programming.

I produce outstanding results by leveraging my strong technical, analytical and problem-solving skills to analyze, design, develop and support highly available, high-performing systems, drawing on user requirements analysis, database design skills, a strong programming background, excellent oral and written communication, and an adroit understanding of business.

EDUCATION

Fairleigh Dickinson University, Teaneck, NJ, USA

Computer Engineering, Master of Science (Graduated 2005)

Studies focused on Software engineering, IT project management, Database application development, Web design, Technical Planning, Business Case Analysis, computer networks and other communication networks. Created working prototypes for Oracle Database applications, web site projects and desktop applications.

PROFESSIONAL ABSTRACT

  • Over 6 years of experience with Master Data Management (MDM)
  • Over 17 years of combined IT experience.
  • Over 12 years of experience as a Data Warehouse Architect.
  • Over 13 years of experience designing, developing & implementing ETL/ELT solutions.
  • 3 years of experience working with Hadoop (both Cloudera and Hortonworks), including HBase, Hive, HDFS, Sqoop, MapReduce programs (using MRJob in Python), Ambari, Cloudera Hue, HBase Master and more.
  • Always looking to learn and understand new technologies and their relevance to the business
  • Experience with Agile/Scrum and RAD/Scrum development environments
  • Experience supporting mission-critical and several other applications
  • Experience working in a large OLAP environment with over 1.2 PB of data in Hadoop
  • Experience working in a large environment with mission-critical OLTP database applications with over 1,000 transactions/minute, 60+ SQL Servers and 150 developers
  • Progressive management experience, team leadership, and an understanding of the value and importance of delegating work and responsibilities
  • Excellent communication, presentation and interpersonal skills, and the ability to prioritize and coordinate work across different teams
  • Experience creating and maintaining system interfaces between Work Management, Customer Information, Finance and Accounting, and HR systems
  • Experience creating and maintaining interfaces using Web services, Message Queues (MQ), flat files, direct data access
  • Experience installing, configuring, managing, monitoring and troubleshooting SQL Server 2008/2005 and Oracle 11g/10g
  • Experience creating and maintaining Reporting infrastructure in Microsoft SQL Server Reporting Services, Crystal Reports, Oracle Reports, ASP.net and Classic ASP.
  • Expert in analyzing and implementing normalization/de-normalization needs.
  • Extensive experience with PL/SQL and T-SQL in creating Tables, Views, Indexes, Stored Procedures, user defined functions, Triggers, Cursors, Roles, Logins, Users, relational database models, self-sufficient data dictionaries and data integrity with constraints.
  • Hands on experience and knowledge of high availability SQL Server solutions, including Log shipping, Replication and SQL Server clustering.
  • Created and used the contingency plans for disaster recovery for the databases and operating systems.
  • Extensive experience monitoring, tuning and analyzing database performance, and allocating server resources to achieve optimum DB performance.
  • Managed data migration from Oracle and Teradata to SQL Server 2005 and vice versa
  • Developed PL/SQL packages, shell scripts to execute the packages and PL/SQL blocks to debug the package in production environment and migrated bulk data.
  • Improved performance by using Explain Plan, creating appropriate indexes, queries optimization, utilizing table spaces and partitioning schemes.
  • Implemented flash back recovery points on tables with sensitive data.
  • Managed access to users on respective DB based on the hierarchy and individual requirements.
  • Excellent SQL code optimization skills using explain plans/execution plans.
  • Strong understanding of SQL server Internals via Dynamic Management Views (DMV).
  • Extensive experience in developing, deploying and monitoring SSIS packages, DTS Packages and DTS to SSIS conversion.
  • Experience in designing, implementing, maintaining, and documenting complex OLTP applications, and data warehouse environments. Extensive experience in design techniques, ETL processes, relational database theory, N-Tier development
  • Excellent report creation skills using Microsoft SQL Server Reporting Services (SSRS).

** References are available upon request **



Hogan Lovells US LLP

Washington, DC
Data Architect
Since Aug 2016
I am currently leading the Master Data Management effort at Hogan Lovells. I implemented the current solution with a team of varying size, from 4 to 14, depending on the need at the time.

Business Environment:

Hogan Lovells is an American-British law firm co-headquartered in London and Washington, D.C. It was formed on May 1, 2010 by the merger of the American law firm Hogan & Hartson (founded 1904) and the British law firm Lovells (formed 1899). Hogan Lovells has around 2,800 lawyers working in more than 40 offices in the United States, Europe, Latin America, the Middle East, Africa and Asia. As of 2017, Hogan Lovells is the seventh largest law firm in the world by gross revenue.

Key Accomplishments at Hogan Lovells:

  • I am leading the effort to create a centralized data integration platform and Master Data Management solution. So far we have successfully implemented the Master Data Management solution for People data, Client-Matter data, and Vendor and Payee data.
  • Worked with the Finance and HHDIR teams to redirect HHDIR to pull data from DIH instead of Elite Enterprise, and then from Elite 3E.
  • The DIH team played a key role in implementation and testing for the Single Finance System project.
  • Currently I am leading the effort to redirect the downstream systems from ODS and HHDIR to pull data from the DIH platform, based on the agenda and timeline outlined in the DIH redirection roadmap.
  • Data Integration Hub (DIH): As the technical lead on the platform, I successfully led the team and played a key role in implementing the following integrations: Workday to DIH, DIH to Workday, and 3E to DIH. This includes making the DIH platform available for direct DB access, REST and SOAP APIs, and access for sysadmins via Excel.
  • Worked with members of the Finance team when implementing the DIH to 3E people integration.
  • Worked closely with the vendor to implement the DIH to AD and AD to DIH integrations.
  • Worked as part of the combined internal and vendor team that developed the EDW strategy and design.
  • Successfully implemented the data extraction process from HoganLovells.com and DIH, combining the data and sending it to the vendor site of the Global Alumni Portal.
  • I have developed several downstream integrations from DIH.
  • Using the EntityListService and Firmographic web services from Dun & Bradstreet, I used Maestro to cleanse and de-duplicate client data so that it could be fed into 3E (for the Elite Enterprise to Elite 3E migration).
  • We were among the first teams to adopt Agile methodology. I have been playing the role of Scrum Master for the DIH project and led all the sprint ceremonies. I received the Certified Scrum Master certification after training (at HL) and the certification exam.

Merkle Inc.

Columbia, MD
Senior Database Application Developer
April 2013 - July 2016

Business Environment:

Merkle is the nation's largest and fastest-growing performance marketing agency. Merkle helps brands transform their marketing organization using data, analytics, and technology to create meaningful, personalized customer experiences that deliver competitive advantage.

Certification, Recognition and Awards at Merkle:

Certified Merkle Digital Professional: I attended 10 classes and passed 11 exams to be certified as an MDP

Merkle One Team Award – for Hadoop migration work from Cloudera to Hortonworks

Merkle attribute award - Passionate – for helping another team with an Oracle database related project

Merkle attribute award - Sense of Urgency – for diligently working on restoring a client's corrupted reference base.

Projects and Responsibilities:

I was part of the team responsible for developing and maintaining an enterprise-wide reference base (Master Data Management) that hosts standardized, consistent and most-current data for individual customers in each client's reference base. This reference base becomes the central source of information on which marketing campaign attributions are based. Various other platforms use this as the source data.

Hadoop: The Hadoop platform was used to create a Connected Recognition (CR) platform, which includes Customer Data Integration (CDI) and Digital Data Integration (DDI). The data was stored in HBase tables, Hive tables and HDFS files. I used Ambari to check system stability and data node availability, and HBase Master to monitor proper data distribution and avoid hot spots in HBase tables. This environment handles over 1.2 PB of data.

Connected Recognition (CR) - Job automation: I implemented the automation of daily data ingestion and weekly report generation jobs for several clients. This included developing several scripts to move files between Hadoop's HDFS and the SAN, creating several Hive scripts and running the MapReduce jobs to process the data, creating jobs in the enterprise scheduling software (Cisco Tidal), and exporting the report data into SQL Server 2012 for billing. This automation framework was eventually applied to all the other clients using the codebase I developed, saving significant time on every client onboarding. Involved Bash, Hive, Sqoop, Java MR jobs and Python.

HBase table performance optimization: Rows in HBase tables are sorted lexicographically by row key. This allows fast access (acting like a clustered index) using the start row key and stop row key on each region, which may or may not be serviced by a separate region server. When too many records within a range of start keys (such as hash keys starting with the letter S) were clustered together (a phenomenon called hot-spotting), I split the region and placed the resulting regions on separate region servers. Hot-spotting usually leads to job failures, as the uneven write load on a single region puts undue pressure on its region server, causing region server failure. I used HBase Master to detect region servers with too many regions and regions that were highly clustered together.
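The key-salting idea commonly used to prevent this kind of hot-spotting can be sketched in a few lines of Python. This is an illustrative sketch only; the bucket count, key format and `salted_row_key` helper are assumptions for the example, not the production code:

```python
import hashlib

def salted_row_key(natural_key: str, num_buckets: int = 16) -> str:
    """Prefix the natural key with a short hash-derived salt so that
    lexicographically adjacent keys land in different HBase regions
    instead of piling onto one region server."""
    digest = hashlib.md5(natural_key.encode("utf-8")).hexdigest()
    salt = int(digest, 16) % num_buckets
    return f"{salt:02d}|{natural_key}"

# Keys that would otherwise cluster together (all starting with "S")
# spread across salt buckets:
keys = [salted_row_key(f"S{i:06d}") for i in range(1000)]
buckets = {k.split("|")[0] for k in keys}
```

Because the salt is derived from the key itself, reads remain possible without a lookup table: recompute the salt from the natural key to find the full row key.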

Automated Hadoop MapReduce job execution and Hive script and query executions using Python 2.7. On a separate project, I used and tested Luigi to run multiple MapReduce jobs and Hive scripts (Luigi is a Python module used to build complex pipelines of batch jobs).

Automated part of the billing process by using Sqoop to transfer the MapReduce job output from HDFS to SQL Server 2012.
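An export of this kind could be assembled as a Sqoop command line roughly as follows. The host name, database, table and password-file path are placeholders, not the actual production values:

```python
def sqoop_export_cmd(hdfs_dir, sql_server, database, table, username):
    """Build a `sqoop export` invocation pushing HDFS output into a
    SQL Server table over JDBC. All names here are illustrative."""
    jdbc = f"jdbc:sqlserver://{sql_server};databaseName={database}"
    return [
        "sqoop", "export",
        "--connect", jdbc,
        "--username", username,
        "--password-file", "/user/etl/.sqlpass",   # placeholder path
        "--table", table,
        "--export-dir", hdfs_dir,
        "--input-fields-terminated-by", "\t",
    ]

cmd = sqoop_export_cmd("/data/billing/output", "sqlprod01",
                       "Billing", "JobOutput", "etl_user")
```

In practice a command like this would be wrapped in the scheduler (Cisco Tidal in the text) so the billing export runs after the MapReduce job completes.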

Legacy key migration: When migrating clients from the legacy platform to the Hadoop-based Customer Data Integration (CDI) platform, we lose the ability to attribute a sale to a given marketing campaign. This project maps the legacy IDs to the current platform, restoring the ability to attribute a sale to a marketing campaign. * Skills Used: Writing several Hive scripts and chaining them using Bash scripts; creating external Hive tables on top of HBase tables.

Migration: Successfully migrated the MasterSource reference base from the legacy KL2 platform to the CR-CDI platform. This is by far the biggest reference base of all the clients we manage on the CDI platform.

HBase snapshots: Successfully managed (created, restored and dropped) HBase table snapshots in several instances to restore corrupted reference bases for individual clients.

Connected Recognition - CDI - POC implementation: The proof-of-concept database is a small-scale database hosted on SQL Server 2012, holding a smaller volume of data (not exceeding 10 million records). Once the client team reviews the output data and approves the match rules, the actual data is ingested onto the Hadoop platform.

Tier 3 support: For Connected Recognition (CR) and Knowledge Link (KL), I was part of the team that provides 24/7 support on a rotating schedule. I was called several times late at night, resolved the issues and brought the solution/process back online. Several times, I had to initiate SWAT calls, bringing in everyone necessary and taking the lead to successfully resolve the failures. * Skills used: SQL expertise, system integration knowledge, knowledge of the Hadoop stack (HBase, Hive, HDFS file manipulation, Sqoop, HBase Master, HBase hotspot detection, etc.), Cisco Tidal enterprise scheduling software, and leadership and managerial skills.

Data analysis: Performed data analysis by extracting data from HBase, Hive, HDFS files, SQL Server 2008/2012, Oracle 10g and Netezza.

Knowledge Link 3 platform: I developed and implemented several functionalities and improvements to the platform, including feature development, bug fixes, etc. Involved Bash scripting.

CR - Digital Data Integration: I worked on modifying the T-SQL code embedded in the Java code to process input data.

Paraccel upgrade feasibility study: Paraccel (Actian Matrix) is a parallel relational database system using a shared-nothing architecture and columnar orientation. I performed the upgrade feasibility study; this involved setting up the environment on a newly built test cluster, importing sample data, testing connectivity from local and remote access, and ensuring the SQL parsing engine would support the existing code base.

Hadoop environment conversion: I was part of the team that migrated the Hadoop environment from Cloudera to Hortonworks, playing a role in testing the successful migration of Sqoop, HDFS and HBase usability. The conversion team received recognition and an award at a company-wide quarterly meeting for the successful migration.

On-boarded several clients onto the CDI and KL35 platforms: T-Mobile, Sanofi, PayPal Credit, ESPN, MetLife, Abercrombie, Cedar Fair, Deckers, Medical Mutual, Regions Bank, SunPower and more.

Worked on the KL35 to CDI migration effort: migrated several clients such as Oriental Trading Co., AAA, Dell, Carter's and Chase Home Equity.

On-boarded clients to the standardized DDI solution: Geico, 1-800-Flowers and Toyota.

Documentation/technical manuals: Created several Standard operating procedure (SOP) documents that are used by the Production operations group for the management of CR-CDI and CR-DDI applications. Detailed instructions for failure recourse and escalation procedures are documented as well.

Skills & tools

The following skills and tools were used to deliver relevant solutions in various scenarios: Bash scripting, Hadoop (HDFS file manipulation; several Hive scripts and ad-hoc HiveQL queries; creating, dropping and querying HBase tables; creating, dropping and restoring snapshots of HBase tables), SQL Server 2012, T-SQL, Oracle PL/SQL, Hue, Ambari, Korn shell scripting, Python 2.6, Java 7, Netezza.



Lockheed Martin IS & GS Civil, Energy services

Manassas, VA
Data Warehouse/ETL Architect and Admin
June 2009 - April 2013

Business Environment

The Energy Services division of Lockheed Martin provides IT services to several electric utilities, including Northern Virginia Electric Cooperative (NOVEC). NOVEC provides electricity to about 150,000 customers in Northern Virginia and had the highest customer satisfaction in the USA for 2012 (per J.D. Power).

Projects and Responsibilities

EDW/BI project: As the technical lead, I led the team to design, develop, implement, test and support the EDW/BI solution over multiple phases, serving as the subject matter expert on the team. Responsibilities included working with end users on requirements analysis, technical specification development, data integration, gap analysis, data mapping, data profiling, data cleansing, validation and the overall solution architecture. I developed the ETL (Extract, Transform & Load) architecture, a self-contained metadata management system within the Oracle database, a performance monitoring system for analysis and tuning, and a role-based security model providing modular access to the Enterprise Data Warehouse. I provided the hardware specifications for the database server and ETL application server, for both production and test environments.

As the primary application admin for Logica ARM WMIS (Work Management Information System), I installed, upgraded and maintained the server infrastructure, which includes the Oracle 11g database, Oracle Application Server, Oracle Advanced Queues, IBM WebSphere MQ and ARM IMFPlus web services. I maintained, upgraded and troubleshot the interfaces between WMIS, GIS (Geographic Information System), Lawson (Financial & HR), PowerPlan (accounting) and the outage management and crew dispatch system (Clevest Mobile Field Force - MFF). I reviewed, installed, upgraded and patched the database and application, and coordinated the fat-client deployment onto over 200 PCs.

SQL Server 2005/2008 DBA: I managed 14 SQL Servers, ensuring server stability, planning for disaster recovery, monitoring for performance issues and tuning the databases for optimal performance. With the server team, I planned, provided the hardware specifications and deployed the SQL Server software on Windows 2008 Enterprise Server for the Mobile Field Force management software.

Oracle 9i/10g/11g: As part of a team of three DBAs, I managed various versions of Oracle databases supporting enterprise applications such as the Enterprise Data Warehouse, Work Management Information System, Geographic Information System and PowerPlan (accounting software).

Consolidated the reporting infrastructure, providing end users one access point to all of the organization's reports. The reporting platforms consolidated include Crystal Reports, Oracle Reports, ASP.net/ASP and ColdFusion-based custom reports. I also provided access to some of the reports from within the client application.

Consulted as the subject matter expert for databases, system interfaces and system integration on several projects, along with the server team. The projects involved understanding the business challenges and working across business groups and technology teams to ensure alignment between business solution definition and system architecture for the organization.

Skills & tools

Skills and tools used to deliver relevant solutions in various scenarios: knowledge and experience in system integration, data integration, data warehouse and ETL design and architecture, dimensional modeling, OLAP, OLTP, RDBMS, normal forms, requirements analysis, source system profiling, SDLC and DBA work; SAS DataFlux Data Management Server and Studio, Toad for Oracle, PL/SQL, Microsoft SQL Server 2005/2008, T-SQL, Crystal Reports, Oracle Reports, ASP.net, and IBM WebSphere MQ, IMFPlus MQ and Oracle MQ based system interfaces; XML, XSLT, XHTML 1.1, JavaScript, MS Visual Studio .net for web and application development using the C# programming language; working knowledge of IBM DB2, source-system-specific and ODBC data connections, MS Visio and the Office suite.

Sprint Nextel

Reston, VA
DBA & SME in DB Design, T-SQL, SSIS & SSRS
November 2007 – May 2009

Business Environment:

The Customer Device Support Data Team is part of the decision support system in the Retail organization. The data/reports provided by the team led to effective decisions involved in raising customer satisfaction to 72% (in 2008/2009), making Sprint the second highest in CSAT in the telecom industry. We were a Rapid Application Development (RAD-Scrum) environment, providing that 'one stop shop' in the Sprint Retail organization for any data needed for executive decision making, up to the CEO level.

Responsibilities

Managed the Device Support Data Team's consolidated data warehouse application, part of Sprint Nextel's decision support system. My role was to establish an enterprise-level database environment with 98% uptime and data availability at an agreed-upon SLA, serving 20,000 users from coast to coast. This included upgrading and maintaining two MS SQL Server 2000 instances to SQL Server 2005 (64-bit & 32-bit), overhauling the then-existing security infrastructure, automating several data imports and exports (SSIS), setting up a web reporting server (SSRS), setting up an SFTP server and optimally maintaining close to 0.75 TB of data. As Sprint is a publicly traded, Fortune 500 company, I developed data access policies and procedures that strictly adhere to SOX compliance. Expertise in SQL coding and SSIS design skills were of great use here.

Representative Projects and Accomplishments

Metadata project: The purpose of this project was to create a holistic view of how data is entered, processed and converted to output format in our system. Involved in every step of the Software Development Life Cycle (SDLC). Took a novel approach, creating a self-sufficient system for obtaining and maintaining custom metadata for all user objects as extended properties. Created a nightly agent job that executes procedures to refresh tables that hold inter-object dependency data, such as job-to-procedure dependencies. Skills Used: Database design and architecture concepts; understanding of MS SQL 2005 system objects/information schema.
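The extended-properties approach can be illustrated with a small generator for the underlying T-SQL calls. The `add_metadata_sql` helper and the object names are hypothetical examples for illustration, not the actual procedures:

```python
def add_metadata_sql(schema: str, table: str, column: str, description: str) -> str:
    """Generate the T-SQL call that stores a custom description as an
    extended property on a column. sp_addextendedproperty is the stock
    SQL Server procedure; everything else here is illustrative."""
    desc = description.replace("'", "''")  # escape single quotes for T-SQL literals
    return (
        "EXEC sys.sp_addextendedproperty "
        "@name = N'MS_Description', "
        f"@value = N'{desc}', "
        f"@level0type = N'SCHEMA', @level0name = N'{schema}', "
        f"@level1type = N'TABLE', @level1name = N'{table}', "
        f"@level2type = N'COLUMN', @level2name = N'{column}';"
    )

stmt = add_metadata_sql("dbo", "Orders", "Status", "Order's current state")
```

Because the metadata lives in the database itself (readable back via `fn_listextendedproperty` or `sys.extended_properties`), the system stays self-sufficient with no external documentation store to drift out of date.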

Picked up the Total Equipment Protection (TEP/ESRP) analysis project midway and consolidated and optimized the process. Added new code to address changing business needs and automated the process to update the data self-sufficiently. This effort received accolades from executive management. Skills Used: T-SQL expertise, business acumen, customer communication and working to tight deadlines.

Created a web report that shows the system-wide data load status, tests all the linked servers and lists the failed jobs. This enables the team to quickly troubleshoot any data load issues. Skills Used: Understanding of SQL Server internals, SQL Server Reporting Services (SSRS) and T-SQL.

Created a reporting environment with SQL Server Reporting Services, which replaced an existing Crystal Reports system. Provided training to developers in creating and deploying reports in SSRS.

Automated several data imports and exports and converted several Data Transformation Services (DTS) packages to SQL Server Integration Services (SSIS) packages. Data sources included Oracle, Teradata, MS SQL Server, MS Access, spreadsheets, flat files (CSV, TSV), and pulling and pushing multiple data files from and to FTP & SFTP sites. Also used VBScript and batch jobs for some of the automation. Skills Used: SSIS, T-SQL, PL/SQL, SQL for Teradata, VBScript, batch jobs (Windows shell scripting).

Optimized a data load from Oracle that used to take over 13 hours (using an ODBC connection) down to less than one hour using Integration Services.

Installed and maintained Oracle database servers on Windows and UNIX machines.

Maintained 98% server uptime without any data loss whatsoever. As my first project, converted an under-the-desk server to an enterprise-class server with automated maintenance and disaster recovery following industry-standard best practices. Response times improved by over 50%, first by separating the data file and log file onto separate disks; further improvement was achieved by employing a database partitioning scheme. Skills Used: Efficient SQL Server configuration, optimized database design, efficient partitioning strategies and effective maintenance.

Upgraded to Windows Server 2003 and SQL Server 2005 64-bit Enterprise Edition on a machine with 8 processor cores and 8 GB RAM (to be upgraded to 24 GB), and installed SQL Server 2005 32-bit on a different machine. Installed and maintained IIS 6.0, IIS 7.0, a secure FTP server (SSH), SSRS, file shares and server communication using Database Mail. Used Visual Studio 2005, SQL Server Business Intelligence Development Studio (BIDS), Management Studio (SSMS), Microsoft Baseline Security Analyzer, Database Engine Tuning Advisor, SQL Server Profiler, Console Job Scripter, Teradata SQL Assistant 7.2, Oracle SQL Developer, SQLDBX and ActivePerl 5.10.0 Build 1002.

Successful disaster recovery: Followed the contingency plan created earlier for situations like this: moved the databases to a different server, reinstalled the Windows Server 2003 OS and SQL Server 2005 64-bit Enterprise Edition, and recovered the databases without any data loss.

Security overhaul of database access, implemented through role-based, view-only data access. Skills Used: Understanding of SQL Server 2005's role-based security model and system security tables.

Created several procedures using Dynamic Management Views (DMVs) to automate various administrative tasks. The following are a few of them (I published some of these at SQLServerCentral.com). Skills Used: SQL Server 2005, knowledge of SQL Server internals via DMVs, T-SQL.

  • Index fragmentation: Automatically picks the most fragmented indexes and rebuilds or reorganizes them. This takes less time and touches only the necessary indexes.
  • Missing indexes: Provides a list of missing indexes in the system.
  • Unused indexes: Unused indexes can be dropped, thereby reducing data load times and meeting the Service Level Agreement (SLA).
  • Provided developers with two procedures that give a to-the-second list of currently running (active) processes and currently locked objects. This helps identify and counter the deadlocks and resource-contention issues that occur on the server, and identifying the locked objects helps in understanding how the system is accessed by the user base.
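The reorganize-vs-rebuild decision driven by fragmentation DMV output can be sketched as follows. The thresholds mirror Microsoft's commonly cited guidance (reorganize between roughly 5% and 30% fragmentation, rebuild above 30%); the helper name and the small-index cutoff are illustrative assumptions, not the exact production values:

```python
def index_action(avg_fragmentation_pct: float, page_count: int) -> str:
    """Decide maintenance for one index, using values like those returned
    by sys.dm_db_index_physical_stats. Thresholds follow general guidance;
    real cutoffs would be tuned per server."""
    if page_count < 100:               # tiny indexes: not worth touching
        return "SKIP"
    if avg_fragmentation_pct > 30:     # heavy fragmentation: full rebuild
        return "REBUILD"
    if avg_fragmentation_pct > 5:      # moderate: cheaper online reorganize
        return "REORGANIZE"
    return "SKIP"
```

Applying this per index, most fragmented first, is what lets the nightly job touch only the indexes that need work instead of re-indexing everything.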

Automatically denied SQL connect permission to users during certain hours to meet a business need. Before this is triggered, all user sessions (with some exceptions) are forcibly killed. This was implemented using sys.sysprocesses and cursors.
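The session-killing step can be outlined in Python for illustration; the real implementation used T-SQL cursors over `sys.sysprocesses`, so the helper and the sample session data here are hypothetical:

```python
def kill_statements(sessions, exempt_logins):
    """Given (spid, login) rows like those selected from sys.sysprocesses,
    emit a KILL statement for every user session except exempt logins.
    Data and names are illustrative only."""
    return [f"KILL {spid};" for spid, login in sessions
            if login not in exempt_logins]

# Example: three sessions, one belonging to an exempt maintenance login.
stmts = kill_statements(
    [(51, "report_user"), (52, "etl_admin"), (53, "analyst")],
    exempt_logins={"etl_admin"},
)
```

After the non-exempt sessions are killed, the DENY CONNECT step can proceed knowing no open connections remain for the affected users.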



InPhonic/Simplexity

Reston, VA
Database Administrator, T-SQL, SSIS & SSRS
July 2005 – October 2007

Responsibilities

Managed an enterprise database environment of 60 SQL Server systems hosting multiple terabytes of databases; the largest individual database exceeded 400 GB. The database systems included 5 active/passive SQL Server clusters on both SQL Server 2000 and SQL Server 2005. The Internet storefront, back office, billing and fulfillment systems' database needs were provided entirely through Microsoft SQL Server.

Business Environment

InPhonic is the leading online seller of wireless services and devices. It has developed its own e-commerce platform that integrates merchandising, provisioning, procurement, customer care and billing operations into a state-of-the-art system of network applications. The constant changes in the mobile phone industry have required this environment to be very dynamic, constantly adapting to an ever-changing business.

Representative Projects and Accomplishments

Programmatically identified schema elements containing sensitive customer data and developed an automated script to obfuscate them in development and test environments. It is automated to search through all the databases on a server. This addressed a SOX compliance issue in the environment. Skills used: Database design, Transact-SQL, administration of MS SQL 2000 and MS SQL 2005.
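The obfuscation step might look like this in outline. The real script was T-SQL driven by a schema scan for sensitive column names, so this Python helper is only an illustrative sketch of the masking idea:

```python
import hashlib

def obfuscate(value: str, keep_last: int = 0) -> str:
    """Replace a sensitive value with a deterministic opaque token,
    optionally keeping the last few characters for debugging.
    Deterministic hashing preserves joinability across tables while
    hiding the real value (an assumption of this sketch)."""
    token = hashlib.sha256(value.encode("utf-8")).hexdigest()[:12]
    suffix = value[-keep_last:] if keep_last else ""
    return f"X{token}{suffix}"
```

A card number masked with `keep_last=4` still ends in its last four digits, so test environments keep realistic-looking data without exposing the original.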

Developed a .NET application using SQL Server Management Objects (SMO) to generate T-SQL "create" scripts for SQL Agent jobs from all servers to a central file share. This improves the resilience of the environment by documenting the hundreds of SQL Agent jobs across all servers, and provides configuration management/tracking of the jobs. This project fills a gap by supplementing the Red Gate SQL Compare daily "snapshots" of database objects, assisting documentation of database changes in this dynamic environment. Skills used: MS SQL Server 2000 and MS SQL Server 2005, Visual Basic .NET, SMO.

Executive support: Elicited business-analysis requirements, translated them into usable technical requirements, then implemented them through SQL Server Reporting Services (SSRS). The resulting reports of Key Performance Indicators (KPIs) are routinely used for executive decisions; examples include whether to expand or reduce the customer call center presence, making informed sales and purchasing decisions, and measuring the performance of managed departments. The business intelligence provided substantial cost savings to the business, and the metrics in some reports are used to report quarterly financial earnings. Provided ad-hoc business metrics to the Call Center Operations management team on call center and staff performance. Skills used: Requirements analysis, verbal and written communication, business acumen, Microsoft SSRS, T-SQL query optimization, software development.

Created and maintained numerous complex DTS and SSIS packages: coordinated and managed third-party communication of business data (e.g. import/export) using XML, flat files, FTP and other technologies. These designs and implementations required extensive knowledge of Data Transformation Services (SQL 2000) and SQL Server Integration Services (SQL 2005). Skills used: Extraction, Transformation and Loading (ETL), business acumen, requirements analysis.

Provided guidance (and informal training) to development teams (150+ developers and 3 DBAs) regarding database design issues such as normalization, query optimization, index design, and trigger and stored procedure design. Assisted developers in troubleshooting unexpected query results, poorly performing queries, and a wide variety of analysis and configuration issues. I was involved in the entire software development life cycle (SDLC). Skills used: Software development experience, verbal and written communication, database design theory, SQL Server performance analysis (e.g. Profiler and execution plans), query optimization.

Performed active/passive cluster installation, upgrade and maintenance for SQL Server 2000 and 2005: installed A/P clusters (Microsoft Cluster Server), migrated user databases from single-node systems to the cluster, and performed OS and RDBMS service pack and patch maintenance. Skills used: MS SQL Server 2000, SQL Server 2005, Windows 2003 Server, Microsoft Cluster Administrator.

Implemented and managed SQL Server replication and log shipping to augment disaster recovery contingencies and minimize potential data loss. Skills used: Microsoft Windows 2003 Server and Windows 2000 Server, SQL Server 2000 and 2005, SQL Server DBA skills.

Administrative responsibilities included, but were not limited to, configuring scheduled backups, performing as-needed restores, monitoring scheduled tasks and SQL Agent jobs, responding to alerts generated by the COTS monitoring solution, and scheduled database performance tuning. Managed database security and permissions from the system level down to individual database objects, performed ad-hoc audits of users accessing the database, and responded to internal audit requests from the SOX and Security teams. Participated in the Change Control Review Board and propagated approved changes to the database servers. Created logical and physical data model diagrams, and optimized data access through performance tuning, indexing and query optimization. Skills used: Problem solving, communication, prioritization, scheduled task management, SQL Server administration, DBCC commands, knowledge of system objects, etc.

Share on-call duty with the DBA team, responding to alerts from the Notification team and from several monitoring tools, including SQL Server Database Mail, SQL Mail, Site Maestro and SiteScope.

Tools in day to day use

SQL server Management Studio (SSMS), SQL Server Integration services (SSIS), SQL Server Reporting Services (SSRS), SQL Server Replication monitor, SQL Enterprise Manager, SQL Query Analyzer, SQL Profiler, Red-Gate SQL Compare, LiteSpeed for SQL Server, MS Visual Studio.net 2003, Visual Studio Express edition.

Development languages and environments

Transact-SQL, PL/SQL, Visual Basic.NET, VBScript, JScript, ASP.NET, ActiveX Scripting, DTS, SSIS



Office of Treasury Technology

Trenton, NJ
Application and SQL Developer & Dev DBA
March 2005 – July 2005

Business Environment

The project was to create a website for accepting proposals/bids from vendors across the state for various ongoing and new projects, and to help management select the best proposals automatically, with the ability to intervene in the process whenever needed. The website also responds to vendors and stores each vendor's transaction history. The project is also integrated with the NJ Division of Taxation.

Responsibilities

Extensively involved in writing stored procedures, ActiveX scripts, audit traces, and cursors; developed reports and migrated the development environment to testing and production environments. Conducted user acceptance testing and rewrote code for performance tuning. Created complex DTS packages, dealing with sensitive data, for migration between the MS SQL Server database and other data sources: MS Excel, CSV, and tab-delimited files from FTP locations. Created jobs that automatically execute the DTS packages and send email notifications to the administrators with the status. Created scheduled stored procedures and triggers for day-end operations using SQL Server Agent. Created automatically running Profiler traces and various scripts, used for auditing purposes.
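The scheduled-job-with-notification pattern described above can be sketched in SQL Server 2000-era T-SQL (server, package, operator, and address names are hypothetical):

```sql
-- Create a SQL Agent job that notifies an operator by email on failure
EXEC msdb.dbo.sp_add_job
    @job_name = N'Nightly Vendor Import',
    @notify_level_email = 2,                 -- 2 = notify on failure
    @notify_email_operator_name = N'DBAdmins';

-- Job step that runs a DTS package via the dtsrun command-line utility
EXEC msdb.dbo.sp_add_jobstep
    @job_name = N'Nightly Vendor Import',
    @step_name = N'Run DTS package',
    @subsystem = N'CmdExec',
    @command = N'dtsrun /S ProdServer /E /N VendorImportPackage';

-- Day-end status mail via SQL Mail (xp_sendmail, SQL Server 2000)
EXEC master.dbo.xp_sendmail
    @recipients = 'dba@example.com',
    @subject    = 'Vendor import complete',
    @message    = 'DTS package VendorImportPackage finished successfully.';
```

`dtsrun /E` uses a trusted connection; a scheduled SQL Agent job wrapping the package keeps execution history and failure notification in one place.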

Environment

MS SQL Server 2000, Query Analyzer, Enterprise Manager, Profiler, DTS Designer, and Windows XP Pro, in a .NET 1.1 environment with 3-tier architecture and a later implementation of web services.



Hamburger One LLC

Teaneck, NJ
Windows Web Server admin, SQL DBA/Dev and web Developer
Jan 2004 - Feb 2005

Business Environment

A professional management firm offering backbone support to affiliated professional service companies serving the securities industry. My project was to administer, support and manage the front-office system and document management system, with a focus on business process management. I worked closely with the business to help make strategic decisions in the company's technology division.

Responsibilities

Responsibilities included, but were not limited to, Windows 2000 Server administration and workstation maintenance, SQL Server 2000 administration, and administration, user-interface design and customization of the TimeMatters 5.0/6.0 software. Provided firm-wide tech support and troubleshooting. Involved in website development; created web pages using DHTML 4.0, HTML, XHTML 1.0 and JavaScript. Gathered user requirements and wrote technical specifications and documentation. Developed and customized reports. Created DTS packages to migrate data between the SQL Server 2000 database and MS Access/MS Excel. Performed daily differential backups and weekly full backups, performed data recovery/restores when needed, and conducted user acceptance testing.
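The weekly-full plus daily-differential backup scheme above corresponds to a short T-SQL pattern (database name and file paths are hypothetical):

```sql
-- Weekly: overwrite the media set with a fresh full backup
BACKUP DATABASE FrontOffice
    TO DISK = 'E:\Backups\FrontOffice_full.bak'
    WITH INIT;

-- Daily: back up only the changes since the last full backup
BACKUP DATABASE FrontOffice
    TO DISK = 'E:\Backups\FrontOffice_diff.bak'
    WITH DIFFERENTIAL;

-- Recovery is a two-step restore: full backup first, left non-operational,
-- then the most recent differential to bring the database current
RESTORE DATABASE FrontOffice
    FROM DISK = 'E:\Backups\FrontOffice_full.bak' WITH NORECOVERY;
RESTORE DATABASE FrontOffice
    FROM DISK = 'E:\Backups\FrontOffice_diff.bak' WITH RECOVERY;
```

Only the latest differential is needed at restore time, which keeps recovery to two steps while limiting daily backup size.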

Environment

Microsoft Windows 2000, MS SQL Server 2000, TimeMatters 5.0/6.0 (front-office system), FrontPage, CoffeeCup HTML Editor, SWiSH 2.0, Windows NT/XP, Visio



SRG Inc

Iselin, NJ
SQL server and .Net Developer
Nov 2003 – Jan 2004

Learned and contributed to the ongoing in-house employee database project; involved in the design and development of an employee database. Created and managed database objects; performed performance tuning, indexing, and query optimization. Performed database backup and recovery. Trained to install, upgrade, configure, maintain and deploy the database server.

Environment

MS SQL Server 2000, IIS, Visual Studio, Windows NT/2000



SSI

TN, India
Database Development/Maintenance staff (Intern)
Jun 2001 – Dec 2002

Involved in the database design, development and maintenance of a payroll database; performed unit and system testing of the developed applications. Created user manuals and technical documentation, and trained users on the new application.

Environment

MS Access, Visual Studio 6.0, WinNT.