How I Tested an FPGA-based BATS Parser using Python

LabVIEW FPGA Implementation of BATS PITCH

I implemented a CBOE / BATS market data parser using LabVIEW FPGA, following the specification from the exchange:

To be more precise, this consists of:

  • the parsing of most of the message types (no Options support) using LabVIEW FPGA
  • a LabVIEW host (non-FPGA) parser and message generator to test my implementation
  • a Python parser and message generator for validation
  • an exported Xilinx Design Checkpoint with a VHDL wrapper, generated using the LabVIEW FPGA IP Export Utility, to bring my LabVIEW FPGA implementation into a Vivado project
  • the pysv Python library, used to call my Python code from a SystemVerilog test bench
  • a simulation that tested the BATS parser using messages generated from Python

My goal was to help bridge the gap between LabVIEW FPGA and Vivado. In doing so I learned that I could go one step further and bridge LabVIEW FPGA to Python, allowing me to leverage the thousands of existing Python libraries to write tests faster for my LabVIEW FPGA code, all while staying inside Vivado.

Source Code

Check out the source code for this on GitHub:


Python implementation

My Python implementation of most of the message types:
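To give a flavor of what such a generator looks like, here is a minimal sketch (not the repository code) of building a PITCH test packet in Python. The field layouts follow the CBOE Multicast PITCH specification: an 8-byte Sequenced Unit Header (length, count, unit, sequence) followed by messages, each starting with its own length byte; all binary fields are little-endian.

```python
import struct

def time_message(seconds: int) -> bytes:
    """Time message: Length=6, Type=0x20, then the time as seconds past midnight."""
    return struct.pack("<BBI", 6, 0x20, seconds)

def sequenced_unit_header(unit: int, sequence: int, messages: list) -> bytes:
    """Sequenced Unit Header: Length(2) Count(1) Unit(1) Sequence(4), then payload."""
    payload = b"".join(messages)
    header = struct.pack("<HBBI", 8 + len(payload), len(messages), unit, sequence)
    return header + payload

# one Time message wrapped in a Sequenced Unit Header
packet = sequenced_unit_header(unit=1, sequence=100, messages=[time_message(34200)])
```

The real implementation covers many more message types, but they all follow this length-prefixed pattern.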

pysv library

I used the open-source pysv library to call Python code from SystemVerilog:

So how does it work?

pysv takes your Python code, generates a DPI library along with SystemVerilog bindings, and then you link the resulting library into your SystemVerilog simulation.

To summarize, especially for non-LabVIEW FPGA users, here is the entire procedure (I will have to make a video at some point):

  1. LabVIEW FPGA IP Export takes your LabVIEW FPGA code and generates an encrypted netlist packaged as a Design Checkpoint (DCP) file, coupled with a VHDL wrapper file that instantiates the IP
  2. I create a Vivado project and add the .dcp file, the VHDL wrapper, and a SystemVerilog simulation / test bench
  3. I run Synthesis and then run a Post-Synthesis Functional Simulation or a Post-Synthesis Timing Simulation
  4. I browse to my project on local disk and find the directory that contains the three generated simulation shell scripts
  5. I copy these scripts, along with the associated .prj files, to my own sub-directory, do some surgery on them, and integrate them into my own Makefile
    1. In short, you have to add the shared library generated by pysv to the elaboration script (the one that invokes xelab)
  6. Then I modify my SystemVerilog to include the Python-to-SystemVerilog binding file and to import everything from it
  7. Then I add some tests that use the Python code (available as DPI functions) to generate my input data
  8. Run the simulation… I suspect that with a Post-Synthesis Timing Simulation I could even use this setup for benchmarking
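For the data-generation side in step 7, the Python helpers are essentially a generator/parser pair. A hedged sketch of the parsing half (layouts per the Multicast PITCH spec; not the repository code):

```python
import struct

def parse_sequenced_unit(packet: bytes):
    """Split a Sequenced Unit Header packet into (unit, sequence, raw messages)."""
    length, count, unit, sequence = struct.unpack_from("<HBBI", packet, 0)
    if length != len(packet):
        raise ValueError("header length must cover the whole packet")
    messages, offset = [], 8
    for _ in range(count):
        msg_len = packet[offset]  # every PITCH message starts with its own length
        messages.append(packet[offset:offset + msg_len])
        offset += msg_len
    return unit, sequence, messages

# round-trip check against a hand-built packet: one Time message (type 0x20)
pkt = struct.pack("<HBBI", 14, 1, 1, 100) + struct.pack("<BBI", 6, 0x20, 34200)
unit, seq, msgs = parse_sequenced_unit(pkt)
```

Feeding generated packets through a parser like this is a cheap sanity check before driving them into the FPGA simulation.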

More Details

So how can you use my code as a model to do this yourself? Follow along, matching each step with the corresponding step number listed above.

  1. You are going to need to know LabVIEW FPGA for this one, and to have a license (or a trial)
  2. Ditto… but for Vivado: you'll want to be comfortable with Verilog and/or VHDL
  3. See my modified versions of the three scripts:
    1. Compile script – runs xvlog with the project file
      1. bats_parser_tb_vlog.prj – add the pysv-generated wrapper file here
    2. Elaborate script – add a link to the pysv-generated library, i.e.
      1. -sv_lib ../build/libpysv
    3. Simulate script – no changes required, but for convenience I added:
      1. -onfinish quit
      2. -onerror quit
  4. See my file structure here:
    1. <root>
    2. Makefile
    3. ./ip_export/
    4. My pysv module – the functions and types that you want exposed to your SystemVerilog test bench. Notice how I defined my own wrapper around a Python list, as this feature was not yet available in the pysv library when I coded this up. Each function has to be wrapped with the @sv() decorator, and you define each type that will be used. You can see a list of all available types in the pysv source code here:
    5. My Python tests – always write tests… it makes figuring things out easier. For the sake of brevity, see the TestSeqUnitHdr class.
    6. My test bench – I put a huge comment (“Testing PYSV”) where my tests of just the pysv functions start; Test #4 is where I pass in a simple Sequenced Unit Header with just a Time message encapsulated inside it.
  5. Note that I tested this using Python 3.11 with pysv version 0.2.0; there is a breaking change in later versions, and I just haven't had the time to figure out exactly what broke (something to do with pybind11, my version of cmake, and gcc). When I upgraded to Python 3.11 and the latest pysv version, after doing some digging I realized that I had cloned the pysv repository without using git's '–recurse-submodules' option. pysv has two submodules – pybind11 and vlstd – and since I was installing pysv from source, those submodules were not available, which was why I was getting an error.
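In the spirit of the repository's TestSeqUnitHdr class, a Python-side test might look like this (a hypothetical sketch: the helper and test names here are illustrative, not the actual test code):

```python
import struct
import unittest

def seq_unit_hdr(unit: int, sequence: int, payload: bytes) -> bytes:
    # hypothetical helper: Sequenced Unit Header framing per the PITCH spec
    return struct.pack("<HBBI", 8 + len(payload), 1, unit, sequence) + payload

class TestSeqUnitHdrSketch(unittest.TestCase):
    def test_time_message_in_header(self):
        # mirrors "Test #4": a single Time message (type 0x20) inside a header
        time_msg = struct.pack("<BBI", 6, 0x20, 34200)
        pkt = seq_unit_hdr(unit=1, sequence=1, payload=time_msg)
        self.assertEqual(len(pkt), 14)
        self.assertEqual(pkt[9], 0x20)
```

Run it with `python -m unittest` in the usual way.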

In summary, I was able to write my first test of my LabVIEW FPGA IP by generating the test data with Python. Next I will see whether I can extend this to a Post-Synthesis Timing Simulation and use it for benchmarking purposes. And of course… a video version of this.
