Question

How to perform Standard Zernike Analysis across multiple files simultaneously using Zemax Python API and Multiprocessing

  • May 14, 2025
  • 1 reply
  • 47 views

Daniel Dominguez

Hi all - As the title states, I’m attempting to perform a Standard Zernike Analysis across multiple files simultaneously using the Zemax Python API and Multiprocessing (running 4 instances of Zemax). My attempt is shown below. Failures I’ve encountered include: 

*** FRU__delta_init(): Attempt to start when running!

and

UnboundLocalError: local variable 'zos' referenced before assignment

I’m not sure if my approach is flawed or if I’m just attempting something that’s not allowed with Zemax or something else. Any insight you can provide would be greatly appreciated. Thanks!

 

Also, thank you to David Nguyen for his example code on performing a Zernike Standard Analysis here.

if __name__ == '__main__':
    import multiprocessing as mp
    import os
    import glob
    from ShortZernMonolith import ShortZernMonolith

    documents_path = os.path.join(os.path.expanduser("~"), "Documents")
    directory_path = documents_path + "\\SenseCheck"

    # Create list of TSAV*.zmx files found in folder
    TSAV_Files = []
    for files in glob.glob(directory_path + '\\TSAV*.zmx'):
        TSAV_Files.append(files)

    # Start 4 processes of ZernCompute
    with mp.Pool(4) as p:
        p.map(ZernCompute, TSAV_Files)

def ZernCompute(ZemaxFile):
    from PythonStandaloneApplication import PythonStandaloneApplication
    import os
    import pandas as pd

    # Open communication with Zemax
    zos = PythonStandaloneApplication()

    # load local variables
    TheSystem = zos.TheSystem

    # Define folder to store files
    documents_path = os.path.join(os.path.expanduser("~"), "Documents")
    QZ_path = documents_path + "\\QuickZern"
    basefileName = os.path.basename(ZemaxFile)
    fileName = os.path.splitext(basefileName)

    # Specify fields, waves, surface and zernike terms of interest
    field = [1, 3, 7, 11]
    wave = [1, 2, 3]
    surface = 96
    zTerms = [4, 6, 7, 11]
    vertex = False
    sub_x = 0.0
    sub_y = 0.0
    sub_z = 0.0
    eps = 0.0
    maxTerms = max(zTerms)

    # Initialize variable list to store Zernike results
    zernResultsFile = []

    # Load desired ZemaxFile
    TheSystem.LoadFile(ZemaxFile, False)

    # Open a Zernike Standard Analysis
    ZernikeStd = TheSystem.Analyses.New_ZernikeStandardCoefficients()

    # Change settings and compute Zernike Standard Analysis
    for j in range(len(wave)):
        for k in range(len(field)):
            Settings = ZernikeStd.GetSettings() # Get analysis settings

            # Apply Settings
            Settings.Field.SetFieldNumber(field[k])
            Settings.Surface.SetSurfaceNumber(surface)
            Settings.Wavelength.SetWavelengthNumber(wave[j])
            # Settings.SampleSize = Samp
            Settings.ReferenceOBDToVertex = vertex
            Settings.Sx = sub_x
            Settings.Sy = sub_y
            Settings.Sz = sub_z
            Settings.Epsilon = eps
            Settings.MaximumNumberOfTerms = maxTerms

            # Run analysis
            ZernikeStd.ApplyAndWaitForCompletion()

            # Send Zernike results to text file
            tempFile = QZ_path + f'\\ZernTempFile{fileName[0]}-{j}{k}.txt'
            ZernikeStd.GetResults().GetTextFile(tempFile)

            # Open and read the text file
            with open(tempFile, 'r', encoding='utf-16') as F:
                Lines = F.readlines()

            ZernResults = []

            for Index in range(38, len(Lines)):
                SplitLine = Lines[Index].split()
                zCheck = str(SplitLine[0])
                TermCheck = int(SplitLine[1])
                Value = float(SplitLine[2])

                # Filter for desired Zernike Terms
                for i in range(len(zTerms)):
                    if zCheck == 'Z' and TermCheck == zTerms[i]:
                        ZernResults.append(Value)

            #os.remove(tempFile)
            zernResultsFile.extend(ZernResults)

    ZernikeStd.Close() # Close the analysis
    print(f"Analysis of {ZemaxFile} complete.")

    df = pd.DataFrame(zernResultsFile).T #Change list to dataframe

    # Write data from zernResultsFile to CSV
    ZernResultsCSV = os.path.join(os.sep, QZ_path, f'{fileName[0]}-ZernResults.csv')
    df.to_csv(ZernResultsCSV, mode='a', index=False, header=False)

    del zos

 

1 reply

chaasjes
  • June 2, 2025

Hi,

Good to see you create separate OpticStudio instances in the separate processes, as the ZOS-API does not allow connecting to multiple OpticStudio instances from the same process. This is a common pitfall when parallelizing code that uses the ZOS-API.

However, there are multiple things that may cause issues here:

  • A new OpticStudio instance is opened for every call to ZernCompute, which may reduce or even eliminate the benefit of using multiprocessing;
  • You attempt to delete zos at the end of ZernCompute, but the underlying object is still referenced through TheSystem. The del statement removes only the name zos; the object itself will not be destroyed until TheSystem can be cleaned up. Luckily, this is likely to happen when the function returns. Still, the UnboundLocalError may be related to this, but that's hard to tell without a stack trace.
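The second point can be demonstrated with plain Python, no OpticStudio required (Connection and System below are hypothetical stand-ins for the object returned by PythonStandaloneApplication() and for TheSystem):

```python
import weakref

class Connection:
    """Hypothetical stand-in for the ZOS-API connection object."""

class System:
    """Hypothetical stand-in for TheSystem, which keeps a reference to its connection."""
    def __init__(self, conn):
        self.conn = conn

zos = Connection()
probe = weakref.ref(zos)      # lets us observe when the object is actually destroyed
TheSystem = System(zos)

del zos                       # removes only the *name* zos ...
assert probe() is not None    # ... the object survives via TheSystem.conn

del TheSystem                 # drop the last remaining reference ...
assert probe() is None        # ... and CPython reclaims the object immediately
```

So `del zos` on its own does not tear down the connection; the object lives until every reference to it is gone.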

It would be more efficient to create stateful processes that create a new OpticStudio instance when initialized, and then reuse that OpticStudio for all calculations in that process. This is hard to do cleanly with Python's multiprocessing library, but I've successfully done this with Ray actors.

Finally, I recommend taking a look at ZOSPy, an open-source Python library for communication with OpticStudio. ZOSPy includes ready-to-use parsers for a number of OpticStudio analyses, and supports the Zernike Standard Coefficients analysis.

