Lincoln Randall McFarland

a.k.a. Randy

Updated: 2021 Dec 27
Contact Information
  • Web:
  • Email:
  • Phone: (650) 906-4958
  • Location: Mountain View, CA
  • Education: B.A., Physics, University of California at Berkeley, 1985.
Computer Experience

I enjoy creating new software tools and making old ones better. To me that means using the language to describe the problem you are trying to solve in a way that is as clear to the machine that runs it as it is to the programmer who comes after to maintain and, hopefully, build on it. I like to build good test automation. It frees you to quickly make big changes without fear of breaking something and not knowing about it until Murphy's Law tells you.

Most of my experience is with Python and C/C++. I also have a good working relationship with JavaScript, Java, Git, SQL databases (PostgreSQL and MySQL), NoSQL databases (MongoDB), and various Unix shell scripts. Examples of my code are available in my GitHub repos: GitHub/lrmcfarland.

Work Experience
Mainspring Energy
Senior Software Engineer
Software and Controls
August 2021 - present
I added blob size and device id to our BigQuery data sets with Terraform. I extended the find and list commands in the Python tools that monitor our generators on AI Notebook to use device id. I created a BigQuery loader tool in Python to backfill missing streaming data from generator gateway telemetry files. I created a streaming telemetry monitor as a GCE cloud service, using a Python Flask server, to alert devops if we have not seen telemetry from a generator after a specified time.
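The staleness check at the core of that telemetry monitor can be sketched in a few lines of stdlib Python; the device ids, threshold and function name here are illustrative stand-ins, not the production code:

```python
# Sketch of the "no telemetry after a specified time" alert rule.
# Device ids, the 15-minute threshold and the function name are hypothetical.
from datetime import datetime, timedelta

def stale_generators(last_seen, now, max_silence=timedelta(minutes=15)):
    """Return the device ids whose last telemetry is older than max_silence."""
    return sorted(dev for dev, ts in last_seen.items() if now - ts > max_silence)

last_seen = {
    "gen-001": datetime(2021, 12, 1, 12, 0),   # seen 5 minutes ago
    "gen-002": datetime(2021, 12, 1, 11, 30),  # silent for 35 minutes
}
now = datetime(2021, 12, 1, 12, 5)
print(stale_generators(last_seen, now))
```

In the real service this check would run on a timer inside the Flask app and page devops for each device it returns.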
Senior Engineer
November 2020 - April 2021
I crossed a Fluent Bit Go plugin template with a Google Bigtable "hello world" and created my first Go product, without knowing Go, Fluent Bit or Bigtable when I started. I had them all building in a Docker container by the end of the week. Knowing C was very helpful, though.
I created an Alpine Python Docker container with a Flask server that reads a JSON config file of Git repos to load and maintain web tracking data in a Redis database. I also created a configuration server, again Alpine Python with Flask in a Docker container, that monitors a Postgres database for changes in one process and returns the latest configuration data via an HTTP API in another. I also added a JavaScript class to collect whois information from various sources and normalize the different formats.
I led the Okta integration effort for our scanner product to protect our routes with OAuth2 authentication. I also added JavaScript functions using axios to notify Salesforce of new accounts and to use their email template API to reply to users with links to their scan results along with our marketing information. I created JavaScript and Python test examples, along with documentation, showing how this works in our environment.
vArmour
Sr. Development Engineer
CTO Organization
February 2017 - November 2020
I joined vArmour to help extend their Distributed Security System (DSS) to support micro-segmentation of container networks. I created a simple Alpine Python based Flask server I called squawker. This ran first in a container, but I later back-ported it to a CentOS VM to test with our original product. The squawkers have an API that lets me POST a JSON list of other squawkers for them to talk to in several protocols: HTTP, SFTP and iperf3, with the ability to easily add anything that has a Python client.
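The peer-list API can be sketched without the Flask plumbing; the payload fields, class and method names below are assumptions for illustration, not the real squawker code:

```python
# Stdlib-only sketch of the squawker peer API (the real server used Flask).
# Payload fields and names are hypothetical.
import json

class Squawker:
    def __init__(self, name):
        self.name = name
        self.peers = []  # (protocol, host) pairs this squawker will exercise

    def post_peers(self, body):
        """Handle a POST of a JSON list like
        [{"protocol": "http", "host": "squawker-2"}, ...]
        and return how many peers are now registered."""
        for peer in json.loads(body):
            self.peers.append((peer["protocol"], peer["host"]))
        return len(self.peers)

sq = Squawker("squawker-1")
n = sq.post_peers('[{"protocol": "http", "host": "squawker-2"},'
                  ' {"protocol": "iperf3", "host": "squawker-3"}]')
print(n)
```

Each registered (protocol, host) pair would then drive the matching client — HTTP, SFTP, iperf3 or any protocol with a Python client.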
I created a pip package of Python requests-based DSS clients to interact with the various RESTful APIs in our products to support maintenance tasks, customer updates and testing. I combined this with pytest to script a full system test: labeling the squawkers to match a customer configuration; applying a policy that used these labels to demonstrate real-time policy violations on demand; verifying the policy was enforced correctly; and cleaning it all up at the end to be ready for the next test. I added pytest-benchmark to measure and record performance for a number of test configurations. I also created a set of containers to set up a standard test environment for locally mounted code under development.
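That label/apply/verify/clean-up flow can be sketched with a stand-in client; the class and method names below are hypothetical, not the real pip package API:

```python
# Sketch of the pytest-driven system test flow with a stand-in DSS client.
# FakeDSSClient and its methods are illustrative, not the real package.
class FakeDSSClient:
    def __init__(self):
        self.labels, self.policies = {}, []

    def label(self, host, **tags):
        """Label a squawker to match a customer configuration."""
        self.labels[host] = tags

    def apply_policy(self, policy):
        """Apply a label-based policy to the test setup."""
        self.policies.append(policy)

    def teardown(self):
        """Clean everything up so the lab is ready for the next test."""
        self.labels.clear()
        self.policies.clear()

def test_policy_violation():
    dss = FakeDSSClient()
    dss.label("squawker-1", role="web")
    dss.label("squawker-2", role="db")
    dss.apply_policy({"deny": ("web", "db")})
    assert dss.policies and len(dss.labels) == 2
    dss.teardown()
    assert not dss.labels and not dss.policies

test_policy_violation()
```

Under pytest the setup and teardown halves would typically live in a fixture so every test starts from the same clean state.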
I wrote a set of developer how-to guides for building our test setups, as well as the first pass of the customer-facing technical documentation. I also answered questions, reviewed pull requests, mentored interns and kibitzed on coding problems.
SilverTail Systems (now EMC/RSA/Dell)
Principal Software Engineer
Research and Development
November 2011 - January 2017
For the SilverTail product I created a collection of Python tools to analyze and synthesize network data, from HTML log files to tcpdump pcap output. I used Python libraries like scapy and TCP/IP tools like Wireshark to create test data. The test data sets were used to debug the scoring algorithms, test the limits of efficacy, measure performance, support unit testing and create demos.
Our development environment was Agile, test driven and object oriented, mostly in C++ and Python. I also implemented algorithms, fixed bugs and mentored developers new to Python.
For the threats research group I analyzed data collected by our product in our priority log format. I wrote several Python scripts to parse this format into Python data structures that could then be inserted into either a normalized relational database like Postgres or a NoSQL database like MongoDB. With indexed access to the transaction data, we were able to explore what other statistical measures were available and how effectively they could contribute to a potential score, balancing a low false positive rate against a high detection rate. And, of course, having actual data helps develop better synthetic models for validation and testing.
I also created a check-in manager, Captain Hook, for our continuous integration process, based on the Python Tornado framework, to help ensure a pull request included successful build and test results before allowing the merge to proceed. This involved responding to check-in events from the GitHub webhook API and sending commands to the Jenkins API to collect build and test results.
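The gatekeeping rule at the heart of that flow can be sketched without the Tornado plumbing; the field names below are illustrative, not the actual GitHub or Jenkins payload formats:

```python
# Sketch of the merge-gating decision: the webhook handler looks up the
# PR's head commit in the Jenkins results before permitting the merge.
# Field names ("head_sha", "build", "tests") are hypothetical.
def merge_allowed(pull_request, jenkins_results):
    """Permit the merge only if the PR's head SHA built and tested green."""
    sha = pull_request["head_sha"]
    result = jenkins_results.get(sha, {})
    return result.get("build") == "SUCCESS" and result.get("tests") == "PASSED"

pr = {"head_sha": "abc123"}
print(merge_allowed(pr, {"abc123": {"build": "SUCCESS", "tests": "PASSED"}}))
print(merge_allowed(pr, {"abc123": {"build": "SUCCESS", "tests": "FAILED"}}))
```

In the real service this decision would run inside a Tornado request handler triggered by the GitHub webhook event.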
CDNetworks
Sr. Software Engineer
Back-end Infrastructure
April 2011 - November 2011
I created several Python daemons to support CDNetworks' back-end infrastructure. This included a core library and the scripts that use it (a customer daily usage calculator for billing and a DNS BIND parser). I wrote the CDNetworks Python style guide (a slightly customized version of PEP 8). I also documented the design, wrote the user's guides and worked closely with QA to validate the code worked as intended.
I wrote an object-oriented DNS BIND parser to support our zone transfer product and integrated it with our database using the GUI's Django models and forms. The OO design made it simple to apply our customizations when processing the data and to adapt to new requirements as they were discovered.
I wrote a simple Python daemon using the multiprocessing module to efficiently parse our log data files into a round-robin database (RRDtool). This included Python scripts to synthesize test data for performance measurements on the input side, a simple daemon to generate the JSON format required for display on the GUI, and a threaded Python HTTP server to deliver it. I also created the bash shell wrappers to manage all of this.
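A minimal sketch of the fan-out parsing idea with the multiprocessing module; the log line format here is an illustrative stand-in for the real data:

```python
# Sketch of the multiprocessing log parser that fed RRDtool.
# The "timestamp bytes" line format is a made-up stand-in.
from multiprocessing import Pool

def parse_line(line):
    """Turn a 'timestamp bytes' log line into an (epoch, count) sample."""
    ts, nbytes = line.split()
    return int(ts), int(nbytes)

def parse_log(lines, workers=2):
    """Fan the per-line parsing out over a small worker pool."""
    with Pool(workers) as pool:
        return pool.map(parse_line, lines)

if __name__ == "__main__":
    print(parse_log(["1000 512", "1001 1024", "1002 256"]))
```

RRDtool then handles the fixed-size, time-bucketed storage, so the daemon's only job is to keep samples flowing in order.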
IronPort Systems (now Cisco Systems)
Software Engineer
Security Applications
April 2005 - April 2011
IronPort makes an email server appliance. For it, I developed the third generation of our Web Based Reputation Service (WBRS) product, used by our web appliances. I wrote the functional and design specs and developed a tool kit of Python/MySQL scripts to generate the reputation updates, test their efficacy and debug their contents. I implemented a Python based rule-weight evaluation utility that applies a gradient descent algorithm to our phone-home data to find the optimal set of rule weights.
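A toy version of that gradient descent fit, on made-up data: repeated small weight updates drive down the error between the weighted rule hits and the known scores. The rules, learning rate and data are invented for illustration.

```python
# Toy gradient descent for rule weights: least-squares error between
# weighted rule-hit vectors and target scores. All data is made up.
def fit_weights(samples, targets, lr=0.1, epochs=500):
    """samples: rule-hit vectors; targets: desired scores."""
    w = [0.0] * len(samples[0])
    for _ in range(epochs):
        for x, t in zip(samples, targets):
            pred = sum(wi * xi for wi, xi in zip(w, x))
            err = pred - t
            # step each weight against the error gradient
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
    return w

# Two rules; the data is consistent with weights of roughly (1.0, 0.5).
samples = [(1, 0), (0, 1), (1, 1)]
targets = [1.0, 0.5, 1.5]
w = fit_weights(samples, targets)
print([round(wi, 2) for wi in w])  # converges toward [1.0, 0.5]
```

The production utility worked the same way in spirit but fit many rules at once against large volumes of phone-home data.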
Prior to that I led the development of the 2.0 release of our SenderBase Reputation Service (SBRS) product, a DNS service used by our email appliances. When I started at IronPort, I worked on our "corpus", a database of spam used with the IronPort Anti-Spam (IPAS) tool.
For these products, I was responsible for writing the specs, code, user's guides and other documentation; coordinating the contributions from other engineers; working with QA to develop test tools and methodology; and resolving the bugs that were found.
For the corpus development, I created a set of rc.subr daemons in Python that processed incoming email from our traps by sending it through our scanning engines and extracting the results for storage in our database, a.k.a. the corpus. IPAS pulls a set of test emails from the corpus for nightly scoring to determine an optimal set of rules to be pushed to our customers' IronPort mail servers. As the corpus progressed through its 2.0 release, I led the effort to hand off the maintenance and further development to our Ukrainian contractors. I wrote the functional specifications and the user's guides for new developers, QA engineers and system administrators.
My initial work on SBRS was to do the planned refactoring and prepare the code base for a 2.0 release. I surveyed the code tree, pruned many dead branches (reducing the code line count by 60%), created the user's guides (increasing their line count by 100%) and updated the configuration process to use our newest tools, while preserving the underlying data structures (mostly in the MySQL schema) to reduce risk. Once we were confident the process was clearly understood, updating the data structures became the focus of the 2.0 release. I worked closely with QA and system administration to provide them with the tools they needed to monitor the system and verify it was functioning correctly, as well as documented procedures for what to do if it was not.
The QSS Group at NASA Ames
Sr. Software Engineer
Information Physics Group
September 2003 - March 2005
I implemented a new computational framework for atmospheric and surface remote sensing, called CSFSR (Classification of Spectral Features in the Solar Radiation), for the Information Physics Group. I also worked on extending the Signal Processing Environment for Application Development (SPEAD) tool kit for the Neuro Engineering Lab.
CSFSR is a largely C++ framework used to find the most likely mix of gases (O3, O2, CO2, NO2 and H2O) seen in a high spectral resolution satellite image of the Earth's surface. It combined solar radiation data with HITRAN data on how these gases absorb light in the atmosphere and used the standard Fortran program DISORT to analyze an image. We experimented with several techniques to find the optimal solution, including simulated annealing and gradient descent. I was responsible for implementing the application using equations provided by the physicists in the group. I also created the C++ wrappers for the Fortran functions in DISORT to link them directly into CSFSR, eliminating the need to parse DISORT's normal text output and greatly increasing processing speed. I also built several test harnesses to validate the accuracy of the model.
The SPEAD tool kit is written with Qt. I added several signal processing modules, including simulators of a simple sine wave signal generator and mixer, along with a spectrum analyzer and oscilloscope. I also created a qmake file builder language, along with a Python script to process it, for generating the makefiles needed by Qt to build SPEAD.
The SETI Institute
Sr. Software Engineer
The Phoenix Group
August 2000 - August 2003
I joined the SETI Institute to work on Project Phoenix's Search System Executive (SSE) for the New Search System (NSS), the continuation of the NASA program to observe stars within 200 light years for radio signals. I wrote many applications to support the observations, from the control interface to the telescopes through to the database that stored the results.
I built on my previous experience controlling RF equipment over the GPIB bus (IEEE-488): tuning local oscillators, setting step attenuators and switches, generating test signals and monitoring system status. To simplify configuration and add flexibility and maintainability to how observations were programmed, I created a C++ library for the equipment and wrapped it with SWIG to create a simple command interface. I turned this into a simple TCP/IP server using Tcl's socket library to process strings sent to a socket. This allowed a client as simple as telnet to send commands to the server, and made it easy to use Expect to create a suite of QA regression tests to validate the server.
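The string-command dispatch behind that server can be sketched in Python rather than C++/Tcl; the commands, device names and replies below are hypothetical:

```python
# Sketch of a telnet-friendly string-command dispatcher, as used to drive
# GPIB equipment. Commands, device names and replies are made up.
def make_dispatcher():
    state = {}  # device name -> last commanded value

    def dispatch(line):
        cmd, *args = line.split()
        if cmd == "tune":            # e.g. "tune lo1 1420.0" sets an oscillator
            state[args[0]] = float(args[1])
            return "OK"
        if cmd == "query":           # e.g. "query lo1" reads it back
            return str(state.get(args[0], "UNSET"))
        return "ERR unknown command"

    return dispatch

send = make_dispatcher()
print(send("tune lo1 1420.0"))
print(send("query lo1"))
```

Because every command is a plain text line with a plain text reply, any client that can write to a socket, telnet included, can drive the server, and Expect scripts can assert on the replies.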
I was also fortunate to have the opportunity to do many things I had not done before. Working mainly from Rubini's Linux Device Drivers book, I wrote the device drivers for two custom PCI boards used to process the signals and monitor status. I created the MySQL schema and the C++ and Java APIs to store the test input parameters and the results of the observations, which included many thousands of signals, all RFI from things like ships' radars and cell phones. I also created a Java Swing application to provide a GUI for the database, making it easier for the astronomers to access the data and generate reports.
However, the most fun was participating in the observations with the astronomers at Arecibo and Jodrell Bank. I wrote a common interface to both observatories' telescope pointing controls and did on-site setup and debugging of our hardware and software.
Frequency Technology (now Sequence Design)
Sr. Software Engineer
August 1998 - August 2000
At Frequency Technology I worked on the Columbus product, a tool for creating a SPICE model of the parasitic capacitance in the interconnects of integrated circuit designs. I ported the source code base from Solaris to the HP-UX and IRIX platforms, updating the build infrastructure and using Rogue Wave's implementation of the C++ STL.
I researched how changes to the mathematics of the model would affect the results, creating several special-purpose software tools to do so. I wrote a C++ parser to allow Columbus to read hierarchical SPICE decks and developed a command line option object to simplify setting and accessing configuration information. I also refactored our Perl build scripts to support builds on the HP-UX and IRIX platforms.
Cadence Design
Member of Consulting Staff
Multimedia Group
October 1996 - June 1998
As a member of the multimedia group, I developed several modules, in C++, for our Signal Processing Workbench (SPW) product, a graphical tool kit for constructing models of signal processing systems. I also provided documentation and customer support for installing and running the new modules.
The modules were part of a custom model, built for Fujitsu, of their JSAT MPEG-2 decoder. These included interfaces to move video data to and from disk files, modules for mixing on-screen display information into the video stream using the vertical blanking interval, and modules to model an asynchronous serial bus and an IC card reader.
Trimble Navigation
Member of Technical Staff III
Land Survey
August 1996 - October 1996
I wrote makefiles to build the source code generated by Rational Rose for the TrimTalk communication product.
TIW Systems (Now Vertex RSI)
Sr. Software Engineer
January 1994 - August 1996
I developed the software (C++/Tcl on Unix) for our in-orbit test (IOT) system for satellite transponders. I was also responsible for installing and verifying the equipment at the customers' facilities (in China, Italy, Luxembourg, Virginia and Wyoming).
The IOT consisted of a suite of tests, often customized to meet customer-specific requirements, that measured the performance of a transponder once the satellite was in its working orbit. The tests ran on a Unix workstation (HP-UX and Linux), sending commands to signal generators over custom TCP/IP client-server applications and via the GPIB bus (IEEE-488). The return signal was measured with a spectrum analyzer or RF power meter and the results were stored in a relational database. I developed a C++ library for the instruments we used, which allowed us to mix and match hardware to quickly address customer customizations. I also wrote the schema for the database tables.
The tests were used on SES Astra's 1D, ChinaSat's DFH-3 and EchoStar's EchoStar 1 satellites. They measured the transponder's local oscillator, equivalent isotropically radiated power (EIRP), saturation curve, frequency response, G/T, spurious emissions and inter-modulation characteristics, to name a few.
Lockheed Missiles and Space Company (Now Lockheed Martin)
Sr. Research Engineer
Algorithm Development Group
May 1986 - January 1994
I started at Lockheed in the electromagnetic compatibility (EMC) group running Fortran computer models of how noise gets into electronic systems. I developed C applications first to analyze the data and then to extend the range of the models. I finished in the Algorithm Development Group, developing a signal processing model to show the effect of various signal recovery techniques.
My first job at Lockheed was to collect the data for, and run, an industry standard Fortran computer model (IEMCAP) of cable bundles in spacecraft designs with regard to electromagnetic compatibility (EMC), a.k.a. cross-talk. I used the results of these models to show that our flight hardware met the MIL-STD-461 requirements or, when it didn't, to determine whether it was safe to grant a waiver. I was also responsible for observing the hardware tests in the EMC lab. Working with Tempest engineers, I developed a new application in C, based on a set of equations in an IEEE paper, that allowed the model to calculate the cross-talk at the much higher frequencies required for Tempest.
In the Algorithm Development Group, I wrote the X11/Motif GUI for our signal processing tool kit (CWID). I also implemented many of its signal processing algorithms. This application served as a test bed for developing new techniques in continuous wave applications, interference rejection and peak detection. After leaving LMSC, I continued to develop signal processing tools, like the Java applets on my web page (see DSP Made Simple and Make Waves with FFTs). I also gained hands-on experience with the signal generators, spectrum analyzers and oscilloscopes in our lab.
Energy Auditor and Retrofitter (now Home Energy)
Contributing Editor
January 1984 - May 1986
I was responsible for producing articles on various aspects of energy conservation in residential housing. This included researching the topic, interviewing people involved with the technology, writing the article and preparing the magazine for publication and distribution.
I was involved in getting the magazine started. I worked on everything from figuring out how to use nroff for our typesetting, to building a database of subscribers and writing the Excel macros to print labels for mass mailings. I wrote articles on energy conservation, including the advantages of compact fluorescent light bulbs and the results of calorimeter measurements I made of the efficiency of microwave ovens, to name two. I presented a paper on desktop publishing at the 1986 ACEEE conference. I also worked as a teaching assistant for the Energy and Resources physics class at U.C. Berkeley.