@@ -60,6 +60,11 @@ We use two blocks to process the streams but that does not mean we use one thread

## Instructions to run
Simply clone the repository onto a Pynq Z2 board and run the notebook "Hyperspectral_Image_Filter_FPGA_CPU_comparison.ipynb" cell by cell.
The code should generate an output file that highlights the rough edges of the image, for example contours and hills.
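As a rough, hypothetical sketch of what the notebook does on the board (the bitstream and DMA names below are assumptions that depend on the actual block design; the notebook itself is the reference), the filter is driven from Python through PYNQ's Overlay and DMA interfaces:

```python
import numpy as np
from pynq import Overlay, allocate

# Load the bitstream for the filter design (file name assumed here).
ol = Overlay("hyperspectral_filter.bit")
dma = ol.axi_dma_0  # DMA feeding the streaming filter IP (instance name assumed)

# One band of the hyperspectral image as an 8-bit grayscale array.
band = np.random.randint(0, 256, (256, 256), dtype=np.uint8)

in_buf = allocate(shape=band.shape, dtype=np.uint8)
out_buf = allocate(shape=band.shape, dtype=np.uint8)
in_buf[:] = band

# Stream the pixels through the edge-highlighting filter and read back the result.
dma.sendchannel.transfer(in_buf)
dma.recvchannel.transfer(out_buf)
dma.sendchannel.wait()
dma.recvchannel.wait()

edges = np.array(out_buf)  # highlighted rough edges, e.g. contours / hills
```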
## What we achieved and the caveat:
<b>*We intended to build an architecture that can process multiple streams in parallel at the same level, and we were successful.*</b>
...
...
@@ -76,7 +81,7 @@ It is not very suitable for image processing tasks as arrays stored in memory do
<b>*The image processing can serve as a stepping stone for controlling multi-agent systems, where each streaming interface can be used for instruction input and output for each agent/bot.*</b>
*We achieved good synchronization between the input streams in terms of pixel processing. We can consider the real-world environment as an array of pixels, with each pixel representing the coordinates of a bot. In this scenario we can process all inputs (pixels) from every bot and implement collision avoidance and basic navigation using the same architecture.*
*We can also consider extending the filter to streaming video; it should be possible with a similar kind of streaming interface.*
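As a purely illustrative host-side sketch of the multi-agent idea (all names here are hypothetical and not part of this repository), two coordinate streams, one per bot, can be consumed in lockstep just like the two pixel streams, with a simple distance check standing in for the collision-avoidance logic:

```python
from math import hypot

SAFE_DISTANCE = 2.0  # hypothetical minimum separation between two bots

def collision_flags(stream_a, stream_b, safe=SAFE_DISTANCE):
    """Consume two (x, y) coordinate streams in lockstep, one per bot,
    and flag the time steps where the bots come too close."""
    for (xa, ya), (xb, yb) in zip(stream_a, stream_b):
        yield hypot(xa - xb, ya - yb) < safe

# Example: two bots moving toward each other along the x axis.
bot_a = ((float(t), 0.0) for t in range(10))
bot_b = ((float(9 - t), 0.0) for t in range(10))
print(list(collision_flags(bot_a, bot_b)))
```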