This quick start guide enables the reader to set up the environment for compiling and executing the openPOWERLINK Linux MN demo for the Zynq Hybrid design using the Vivado 2016.2 toolchain.

1. Hardware Requirements

The hardware items required to run openPOWERLINK Linux MN demo for the Zynq Hybrid design on Zynq ZC702 are as follows:

  • Zynq ZC702 board (used as the openPOWERLINK MN) – 1

  • AVNET expander board (AES-FMC-ISMNET-G) – 1

  • Linux PC – 1

  • Micro SD card reader – 1

  • Micro SD card – 1

  • Ethernet cable – 1

  • Mini USB serial cable – 1

2. Software Requirements

The software items required to run openPOWERLINK Linux MN demo for the Zynq Hybrid design on Zynq ZC702 are as follows:

  • Ubuntu 14.04

  • Vivado – 2016.2

  • CMake v2.8.7 or a later version

    • Install CMake and the CMake GUI using the following commands:
      • sudo apt-get install cmake

      • sudo apt-get install cmake-gui

  • Xilinx Linux (https://github.com/Xilinx/linux-xlnx)

    (Note: After cloning, use the following command to check out the branch required for the Vivado 2016.2 toolchain.)

    • git checkout -b zynq-build xilinx-2016.2

  • sudo apt-get install libncurses5-dev (Note: Used with menuconfig)

  • sudo apt-get install u-boot-tools (Note: To create uImage)

  • Download the RT Preempt patch, version 4.4-rt2, for the Xilinx Linux kernel from the link below:

  • Download the openPOWERLINK stack from the link below:

      • Change directory to the downloaded stack

      • Check out the 2.5.0 branch or a later one using the following command

        • git checkout <branch_name>
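
For convenience, the software setup above can be collected into a single terminal session. The sketch below is only illustrative: it reuses the placeholders and the branch name given in this list, and the download links omitted above are not filled in here.

    # Install the build prerequisites listed above (Ubuntu 14.04 package names)
    sudo apt-get install cmake cmake-gui libncurses5-dev u-boot-tools

    # Clone the Xilinx Linux kernel and check out the 2016.2 baseline
    git clone https://github.com/Xilinx/linux-xlnx
    cd linux-xlnx
    git checkout -b zynq-build xilinx-2016.2

    # Check out the required openPOWERLINK branch (2.5.0 or later)
    cd <openPOWERLINK_dir>
    git checkout <branch_name>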

3. Steps to apply RT Preempt patch to the Linux kernel source

This section describes the steps to be carried out on the Linux PC to apply the RT Preempt patch to the Xilinx Linux kernel sources and compile the kernel.

  • Open terminal and move to the Xilinx Linux directory

    • cd <Xilinx_Linux_dir>

  • Apply the patch using the following command

    • patch -p1 < <(gunzip -c "path_to_patch-4.4-rt2.patch.gz")

Figure 1: Apply the RT Preempt patch to the Linux source code
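
If you want to verify the patch before modifying the sources, GNU patch supports a dry run. This optional check is a sketch assuming the same patch archive path as above.

    # Optional: preview the patch without changing any files
    gunzip -c "path_to_patch-4.4-rt2.patch.gz" | patch -p1 --dry-run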

3.1 Steps to compile the Linux kernel source for Zynq ZC702

This section describes the steps to be carried out on the Linux PC to compile the Linux kernel source and create the kernel image file for Zynq ZC702.

  • Export the cross compilation environment variables using the following command

    • export CROSS_COMPILE=arm-linux-gnueabihf-

  • Configure the Linux kernel parameters using the default Zynq configuration file

    • make ARCH=arm xilinx_zynq_defconfig

Figure 2 : Configure the Linux kernel parameters using Zynq default configuration

  • Compile the kernel using the following command

    • make ARCH=arm CROSS_COMPILE=<Xilinx_dir>/SDK/2016.2/gnu/aarch32/lin/gcc-arm-linux-gnueabi/bin/arm-linux-gnueabihf-

  • To create uImage file

    • make ARCH=arm UIMAGE_LOADADDR=0x8000 uImage CROSS_COMPILE=<Xilinx_dir>/SDK/2016.2/gnu/aarch32/lin/gcc-arm-linux-gnueabi/bin/arm-linux-gnueabihf-

  • Compile and install the modules for the Linux kernel using following commands

    • Compile module

      • make ARCH=arm CROSS_COMPILE=<Xilinx_dir>/SDK/2016.2/gnu/aarch32/lin/gcc-arm-linux-gnueabi/bin/arm-linux-gnueabihf- modules

    • Install module

      • make ARCH=arm CROSS_COMPILE=<Xilinx_dir>/SDK/2016.2/gnu/aarch32/lin/gcc-arm-linux-gnueabi/bin/arm-linux-gnueabihf- modules_install

Figure 3: Compile and Install modules for the Xilinx-Linux kernel
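
As a convenience, the kernel build commands above can also be run with the cross-compiler prefix exported once, which keeps the make invocations short. This is only a sketch assuming the Xilinx SDK 2016.2 toolchain path shown above.

    # Export the architecture and toolchain prefix once (assumed SDK install path)
    export ARCH=arm
    export CROSS_COMPILE=<Xilinx_dir>/SDK/2016.2/gnu/aarch32/lin/gcc-arm-linux-gnueabi/bin/arm-linux-gnueabihf-

    # Configure, then build the kernel, the uImage and the modules
    make xilinx_zynq_defconfig
    make
    make UIMAGE_LOADADDR=0x8000 uImage
    make modules
    make modules_install      # optionally add INSTALL_MOD_PATH=<staging_dir> to avoid installing into the host root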

4. Steps to build the Hardware for Zynq Hybrid design
  • Change to the Xilinx Vivado directory

    • cd <Xilinx_dir>/Vivado/2016.2/bin

  • Open the Vivado 2016.2 TCL console

    • ./vivado -mode tcl

Figure 4: Open the Vivado TCL console

  • Execute the following commands to set the SDK environment path

    • xsct

    • vivado -mode tcl

    • vivado -mode batch

Figure 5: Set the SDK environment path

  • Change directory

    • cd <openPOWERLINK_dir>/hardware/build/xilinx-microblaze

  • Execute the command

    • cmake -G"Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE=../../../cmake/toolchain-xilinx-microblaze-gnu.cmake ../..

  • Execute the command

    • cmake ../.. -DCMAKE_BUILD_TYPE=Debug -DSKIP_BITSTREAM=OFF -DDEMO_Z702_MN_DUAL_SHMEM_GPIO=ON

Figure 6: Configure CMake to build the hardware for the Zynq Hybrid design in Debug mode

  • Execute the command

    • make install

Figure 7: Hardware build in Debug mode

  • Repeat the steps above to build in Release mode by setting CMAKE_BUILD_TYPE as ‘Release’

Figure 8: Configure CMake to build the hardware for the Zynq Hybrid design in Release mode

  • Execute the command

    • make install

Figure 9: Hardware build in Release mode
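
For reference, the hardware build steps of this section can be collected into one sequence. This is a sketch assuming the Vivado 2016.2 environment is already available in the shell (for example after sourcing settings64.sh as in section 9) and reuses the CMake options shown above.

    # Configure and build the hardware (Debug shown)
    cd <openPOWERLINK_dir>/hardware/build/xilinx-microblaze
    cmake -G"Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE=../../../cmake/toolchain-xilinx-microblaze-gnu.cmake ../..
    cmake ../.. -DCMAKE_BUILD_TYPE=Debug -DSKIP_BITSTREAM=OFF -DDEMO_Z702_MN_DUAL_SHMEM_GPIO=ON
    make install
    # Repeat the second cmake call and make install with -DCMAKE_BUILD_TYPE=Release for the Release build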

5. Steps to build the PCP

This section describes the steps to be carried out on the Linux PC to compile and build the PCP for the Zynq Hybrid design.

5.1 Steps to build the driver library for Microblaze

  • Change directory

    • cd <openPOWERLINK_dir>/stack/build/xilinx-microblaze

  • Execute the command

    • cmake -G"Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE=../../../cmake/toolchain-xilinx-microblaze-gnu.cmake ../.. -DCMAKE_BUILD_TYPE=Debug -DCFG_COMPILE_LIB_MNDRV_DUALPROCSHM=ON

Figure 10: Configure driver library for microblaze in Debug mode

  • Execute the command

    • make install

Figure 11: Build the driver library for microblaze in Debug mode

  • Repeat the steps above to build in Release mode by setting CMAKE_BUILD_TYPE as ‘Release’

Figure 12: Configure the driver library for microblaze in Release mode

  • Execute the command

    • make install

Figure 13: Build the driver library for microblaze in Release mode

5.2 Steps to build the driver application for Microblaze

  • Change directory

    • cd <openPOWERLINK_dir>/drivers/xilinx-microblaze/drv_daemon/build

  • Execute the command

    • cmake -G"Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE=../../../../cmake/toolchain-xilinx-microblaze-gnu.cmake .. -DCMAKE_BUILD_TYPE=Release -DCFG_BUILD_KERNEL_STACK="PCP Daemon Dual-Proc" -DCFG_HW_LIB=xilinx-z702/mn-dual-shmem-gpio

Figure 14: Configure the driver application for microblaze in Release mode

  • Execute the command

    • make install

Figure 15: Build driver application for microblaze in Release mode

  • Exit from the Vivado TCL console
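
For reference, the PCP builds of this section can be collected into one sequence, run in the same environment used above before exiting the TCL console. The options are the ones shown in this section; switch between Debug and Release via CMAKE_BUILD_TYPE as before.

    # Driver library for Microblaze (Debug shown)
    cd <openPOWERLINK_dir>/stack/build/xilinx-microblaze
    cmake -G"Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE=../../../cmake/toolchain-xilinx-microblaze-gnu.cmake ../.. -DCMAKE_BUILD_TYPE=Debug -DCFG_COMPILE_LIB_MNDRV_DUALPROCSHM=ON
    make install

    # Driver daemon for Microblaze (Release)
    cd <openPOWERLINK_dir>/drivers/xilinx-microblaze/drv_daemon/build
    cmake -G"Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE=../../../../cmake/toolchain-xilinx-microblaze-gnu.cmake .. -DCMAKE_BUILD_TYPE=Release -DCFG_BUILD_KERNEL_STACK="PCP Daemon Dual-Proc" -DCFG_HW_LIB=xilinx-z702/mn-dual-shmem-gpio
    make install
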
6. Generate FSBL
  • Change directory

    • cd <Xilinx_dir>/SDK/2016.2/bin

  • Execute the command

    • sudo ./xsdk

Figure 16: Create a workspace in SDK environment

  • Select an existing workspace or create a new workspace

  • Click File->New->Application project

  • Create a new application project and enter a project name

  • Click on New under target Hardware

Figure 17: Create a new application project

  • Browse to the hardware platform file “<openPOWERLINK_dir>/hardware/lib/generic/microblaze/xilinx-z702/mn-dual-shmem-gpio/hw_platform/system.hdf”

  • Click Finish to proceed

Figure 18: Select the hardware platform

  • Ensure that the OS platform is ‘standalone’ and the processor is ‘ps7_cortexa9’ in the application project window

  • Click Next to proceed

Figure 19: Select the processor in Target hardware platform

  • Select Zynq FSBL and click Finish

Figure 20: Generate Zynq FSBL

  • fsbl.elf is generated in the Debug folder of the application project in the Xilinx SDK workspace

  • Exit from the SDK workspace

7. Generate BOOT.bin
  • Open terminal and change directory

    • cd <openPOWERLINK_dir>/tools/xilinx-zynqvivado

  • Copy all the required binaries to “<openPOWERLINK_dir>/tools/xilinx-zynqvivado”

  • Files required for creating BOOT.bin:

    • fsbl.elf (from <Xilinx_SDK_workspace>/<project_name>/Debug/)

    • download.bit (from <openPOWERLINK_dir>/bin/generic/microblaze/xilinx-z702/mn-dual-shmem-gpio)

    • u-boot.elf (from the Zynq ZC702 package: http://www.wiki.xilinx.com/Zynq+2016.2+Release)

    • oplkdrv-daemon_o.elf (from <openPOWERLINK_dir>/bin/generic/microblaze/xilinx-z702/mn-dual-shmem-gpio)

  • Execute the command

    • <Xilinx_dir>/SDK/2016.2/bin/bootgen -image bootimage.bif -o i BOOT.bin

Figure 21: Generate BOOT.bin
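
bootgen reads the list of boot partitions from the BIF file. A minimal bootimage.bif for the files listed above could look like the sketch below; the partition order (FSBL first as the bootloader, then the bitstream, U-Boot and the PCP daemon ELF) is an assumption and must match your actual boot flow.

    the_ROM_image:
    {
        [bootloader]fsbl.elf
        download.bit
        u-boot.elf
        oplkdrv-daemon_o.elf
    }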

8. Generate device tree blob
  • In terminal, change directory to the device tree source path using the following command

    • cd <openPOWERLINK_dir>/hardware/boards/xilinx-z702/mn-dual-shmem-gpio/sdk/handoff/

  • DTC (the device tree compiler) is part of the Linux kernel sources. “<Xilinx_Linux_dir>/scripts/dtc/” contains the DTC source code, and it must be compiled before it can be used (see the sketch after this list)

  • Compile the device tree blob from system.dts using the following command

  • <Xilinx_Linux_dir>/scripts/dtc/dtc -I dts -O dtb -o devicetree.dtb system.dts

Figure 22: Generate device tree blob
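
If the dtc binary is not already present, it is normally built as a host tool during the kernel build of section 3.1. The check below is a convenience sketch under that assumption.

    # Verify the in-tree device tree compiler exists; build the kernel scripts if it is missing
    ls <Xilinx_Linux_dir>/scripts/dtc/dtc || make -C <Xilinx_Linux_dir> ARCH=arm scripts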

9. Steps for cross-compiling the openPOWERLINK Linux MN for the Zynq Hybrid design

This section describes the steps to cross-compile the openPOWERLINK Linux MN demo for Zynq ZC702 for the Zynq Hybrid design.

  • Set the Xilinx Vivado environment by executing the following command

    • source <Xilinx_dir>/Vivado/2016.2/settings64.sh

  • Open cmake-gui

Figure 23: Open cmake-gui

  • To compile the stack libraries

    • Point the “Where is the source code” to the stack source folder “<openPOWERLINK_dir>/stack”

    • Point the “Where to build the binaries” to the stack build folder “<openPOWERLINK_dir>/stack/build/linux”

    • Click the “Configure” button

    • In “Specify the Generator for this project” dialogue box, select “Unix Makefiles” generator and select “Specify toolchain file for cross-compiling” and click Next

Figure 24: Specify the generator for the project

  • Provide the path for “Specify the Toolchain file” as below, “<openPOWERLINK_dir>/cmake/toolchain-xilinx-vivado-arm-linux-eabi-gnu.cmake”

Figure 25: Specify the toolchain file for cross-compiling

  • Select the “CFG_COMPILE_LIB_MNAPP_ZYNQINTF” option to build the MN library

Figure 26: Specify the compiler library for Zynq MN

  • Click “Configure” to apply the settings and click “Generate” to create makefile with the modified configuration

  • Change directory to the stack build path

    • cd <openPOWERLINK_dir>/stack/build/linux

  • Use the following command to compile the stack

    • make install

Figure 27: Build the stack libraries for Zynq Arm in Debug mode

  • Repeat the above steps to build in Release mode by setting CMAKE_BUILD_TYPE to Release

Figure 28: Configure the stack libraries for Zynq Arm in Release mode

  • Execute the command

    • make install

Figure 29: Build the stack libraries for Zynq Arm in Release mode
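
If you prefer a scripted, non-interactive build, the cmake-gui configuration above can also be expressed on the command line. This is a sketch under the assumption that the same options are passed as -D definitions.

    # Command-line equivalent of the cmake-gui configuration (Debug shown)
    cd <openPOWERLINK_dir>/stack/build/linux
    cmake -G"Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE=../../../cmake/toolchain-xilinx-vivado-arm-linux-eabi-gnu.cmake ../.. -DCMAKE_BUILD_TYPE=Debug -DCFG_COMPILE_LIB_MNAPP_ZYNQINTF=ON
    make install
    # Repeat with -DCMAKE_BUILD_TYPE=Release for the Release build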

  • To compile the driver libraries

    • Point “Where is the source code” to the driver source folder “<openPOWERLINK_dir>/drivers/linux/drv_kernelmod_zynq”

    • Point “Where to build the binaries” to the driver build folder “<openPOWERLINK_dir>/drivers/linux/drv_kernelmod_zynq/build”

    • In the “Specify the Generator for this project” dialogue box, select the “Unix Makefiles” generator and “Specify toolchain file for cross-compiling”, as shown in the ‘compile the stack libraries’ section above

    • Provide the path for “Specify the Toolchain file” as below: “<openPOWERLINK_dir>/cmake/toolchain-xilinx-vivado-arm-linux-eabi-gnu.cmake”

Figure 30: Specify the toolchain for building the driver libraries

  • Set CFG_KERNEL_DIR to the “<Xilinx_Linux_dir>”

Figure 31: Set the kernel directory for the driver build

  • Select “Configure” to apply the settings and click “Generate” to create makefile with the modified configuration

  • Change directory to the driver build path

    • cd <openPOWERLINK_dir>/drivers/linux/drv_kernelmod_zynq/build

  • Use the following command to compile the driver

    • make install

Figure 32: Build the driver for Zynq Arm
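
Equivalently, the driver kernel module can be configured from the command line. The sketch assumes the same toolchain file and the CFG_KERNEL_DIR setting described above.

    # Command-line equivalent for the Zynq driver kernel module
    cd <openPOWERLINK_dir>/drivers/linux/drv_kernelmod_zynq/build
    cmake -G"Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE=../../../../cmake/toolchain-xilinx-vivado-arm-linux-eabi-gnu.cmake .. -DCFG_KERNEL_DIR=<Xilinx_Linux_dir>
    make install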

  • To compile the application libraries

    • Point “Where is the source code” to the application source folder “<openPOWERLINK_dir>/apps/demo_mn_console”

    • Point “Where to build the binaries” to the application build folder “<openPOWERLINK_dir>/apps/demo_mn_console/build/linux”

    • In the “Specify the Generator for this project” dialogue box, select the “Unix Makefiles” generator and “Specify toolchain file for cross-compiling”, as shown in the ‘compile the stack libraries’ section above

Figure 33: Specify the toolchain file for the demo application

  • Provide the path for “Specify the Toolchain file” as below: “<openPOWERLINK_dir>/cmake/toolchain-xilinx-vivado-arm-linux-eabi-gnu.cmake”

  • Set CFG_BUILD_KERNEL_STACK to “Kernel stack on Zynq PCP”

Figure 34: Configure the kernel stack as Zynq PCP

  • Select “Configure” to apply the settings and click “Generate” to create makefile with the modified configuration

  • Change directory to application build path

    • cd <openPOWERLINK_dir>/apps/demo_mn_console/build/linux

  • Use the following command to compile the application

    • make install

Figure 35: Build the demo application for Zynq MN
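
The demo application configuration can likewise be done non-interactively. This sketch assumes that CFG_BUILD_KERNEL_STACK accepts the same string value shown in cmake-gui.

    # Command-line equivalent for the MN demo application
    cd <openPOWERLINK_dir>/apps/demo_mn_console/build/linux
    cmake -G"Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE=../../../../cmake/toolchain-xilinx-vivado-arm-linux-eabi-gnu.cmake ../.. -DCFG_BUILD_KERNEL_STACK="Kernel stack on Zynq PCP"
    make install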

10. Steps to execute the openPOWERLINK Linux MN demo application on Zynq ZC702

This section describes the steps to run the openPOWERLINK Linux MN demo on the Zynq ZC702 development board.

  • Refer to the link below to prepare the SD card as a bootable medium for Zynq: http://www.wiki.xilinx.com/Prepare+Boot+Medium

  • Refer to the link below to download the Zynq ZC702 2016.2 pre-built Linux binaries (assuming cross-compilation of Linux was done using the Xilinx Vivado 2016.2 toolchain): http://www.wiki.xilinx.com/Zynq+2016.2+Release

  • Extract and copy the following content from the downloaded folder to the boot partition of the SD card (a sketch of the copy commands follows this list)

    • uramdisk.image.gz

    • devicetree.dtb

    • BOOT.bin

    • openPOWERLINK driver and application binaries from

      • <openPOWERLINK_dir>/bin/oplkdrv_kernelmodule_zynq

      • <openPOWERLINK_dir>/bin/demo_mn_console

    • Replace the existing uImage with the cross-compiled uImage from

      • <Xilinx_Linux_dir>/arch/arm/boot
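
For illustration, assuming the boot partition of the SD card is mounted at /media/BOOT on the Linux PC and the commands are run from the directory containing the extracted pre-built binaries together with the generated BOOT.bin and devicetree.dtb, the copy step could look like this; the mount point and layout are assumptions.

    # Copy boot files to the SD card boot partition (mount point is an assumption)
    cp uramdisk.image.gz devicetree.dtb BOOT.bin /media/BOOT/
    cp -r <openPOWERLINK_dir>/bin/oplkdrv_kernelmodule_zynq /media/BOOT/
    cp -r <openPOWERLINK_dir>/bin/demo_mn_console /media/BOOT/
    cp <Xilinx_Linux_dir>/arch/arm/boot/uImage /media/BOOT/    # replaces the pre-built uImage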

  • Hardware setup

    • Connect the Avnet expander board to the J3 (FMC1) connector of the Zynq ZC702 board

    • Connect the Ethernet cable between one of the Ethernet ports (J6/J2) of the Avnet expander board and the slave in the network

  • To run the openPOWERLINK

    • Insert the SD card into the Zynq ZC702 board

    • Connect the USB UART port of the Zynq ZC702 board to the Linux PC

    • Execute the following command to open the minicom terminal

      • sudo minicom -s

Figure 36: Open the minicom terminal

  • Go to serial port setup and click enter

Figure 37: Select the serial port setup

  • Change the serial device as per the USB name (Example: /dev/ttyUSB0)

  • Set the hardware flow control settings to “NO”

Figure 38: Configure the serial port setup

  • Save setup as dfl and exit

Figure 39: Save the serial port setup configuration

  • Once the auto-boot finishes, enter the username as “root”

  • Mount the SD card using the following command

    • mount /dev/mmcblk0p1 /mnt/

  • Change directory and load the openPOWERLINK driver module

    • cd /mnt/oplkdrv_kernelmodule_zynq

    • insmod oplkmnzynqintf.ko

  • Change directory and run the demo application

    • cd /mnt/demo_mn_console

    • ./demo_mn_console

11. Steps to execute the openPOWERLINK slave demo application

This section describes the steps to be carried out to run the openPOWERLINK Linux CN demo.

11.1 Run the PCAP driver daemon on openPOWERLINK slave

  • Open the terminal

  • Change to the driver daemon install directory on slave PC 1

    • cd <openPOWERLINK_dir>/bin/linux/i686/oplkd-pcap/

  • Run the PCAP driver daemon using the following command:

    • sudo ./oplkcnd-pcap

11.2 Run the application on openPOWERLINK slave

  • Open the terminal

  • Change to the slave application install directory on slave PC 1

    • cd <openPOWERLINK_dir>/bin/linux/i686/demo_cn_console/

  • Run the slave application using following command:

    • sudo ./demo_cn_console

Note: Repeat sections 11.1 and 11.2 above on slave PC 2.


We hope this guide has helped you get started on your Deterministic Ethernet journey. Keep in touch! For more assistance, please write to us at enterprise.services [at] kalycito.com