
About our project » History » Version 18

Great Gilbert Cansancio SOCO, 11/06/2025 03:15 PM

[[Wiki|← Back to Start Page]]

h1. Project Details

---

h2. Overview

The goal of this project is to create a unified display by merging the images from two projectors into one.

To achieve this, we employ techniques such as *gamma correction*, *alpha blending*, and *intensity adjustment* to refine the final output.

The project team is organized into several roles:

* Project Manager
* Team Leader
* Doxygen Documentation Team
* Wiki Team
* Coding Team
* UML Design Team

---

h2. Software and Tools

* *Redmine* – Used to manage project tasks, track progress, and organize team activities.
* *Python* – Used for image processing operations such as blending, gamma correction, and intensity adjustment.
** *OpenCV* – handles computer vision and image processing operations.
** *numpy* – supports image processing with numerical computation functions.
** *json* – stores configuration read by the JSON reader (config_reader.py) and used to set up the video and image processing elements of Main.
** *tkinter* – used to create the GUI and support GUI operations.
* *Doxygen* – Employed to generate detailed code documentation automatically.
* *Astah* – Used to design and visualize UML diagrams representing the system’s structure and workflow.

---

h2. Technology

We plan to use a single flat-screen display illuminated by two projectors connected to separate computers, allowing both images to be projected simultaneously and combined into one seamless image.

To accomplish this, we will apply:

* Gamma correction
* Alpha blending
* Intensity modification

p=. !{width:800px}car.png!

---

h2. Alpha Blending Method

Alpha blending merges two images using a specific *alpha value* that determines their transparency levels. This process is crucial in rendering and game development for combining visual elements smoothly and reducing jagged edges.
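As a minimal illustration of the technique (not the project's actual code; the function name @alpha_blend@ and the synthetic 2×2 images are ours), a single fixed alpha value blends two equally sized arrays:

```python
import numpy as np

def alpha_blend(foreground, background, alpha):
    """Blend two equally sized images: alpha*foreground + (1 - alpha)*background."""
    fg = foreground.astype(np.float32)
    bg = background.astype(np.float32)
    blended = alpha * fg + (1.0 - alpha) * bg
    return np.clip(blended, 0, 255).astype(np.uint8)

# Two synthetic 2x2 grayscale "images": all white and all black.
white = np.full((2, 2), 255, dtype=np.uint8)
black = np.zeros((2, 2), dtype=np.uint8)

print(alpha_blend(white, black, 0.5))  # every pixel is 127 (0.5 * 255, truncated)
```

In the actual program the alpha value is not a single constant but a per-pixel gradient, as described in the Transparency section below.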

Our program uses:

* *Tkinter* – for the graphical interface
* *OpenCV* – for image manipulation
* *PIL* – for image rendering

Users can load configuration files, apply gamma correction and alpha blending, and view or save the processed images.

p=. !blendingexmaple.jpg!

External reference:
[https://takinginitiative.net/2010/04/09/directx-10-tutorial-6-transparency-and-alpha-blending/]

---

h2. Gamma Correction Method

This project uses a technique known as *gamma correction*, which modifies an image’s brightness by adjusting the luminance of each pixel in a *non-linear* way. This method aligns image brightness with *human visual perception*, as the human eye is more sensitive to variations in darker tones than in brighter ones; this biological adaptation allows us to see effectively across a wide range of lighting conditions. Gamma correction applies a *power-law transformation* to pixel values, as represented by the following equation:

p=. !{width:800px}equation.png!

When the gamma value (γ) is greater than 1, the image appears darker; when it is less than 1, the image appears brighter.

Note: Since pixel intensity values range between 0 and 255, the process first normalizes the image (dividing by 255), then applies the power function with the specified gamma value, and finally scales the result back to the original 0–255 range.
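The three steps in the note can be sketched in Python/numpy as follows (a minimal sketch assuming the standard power-law form; @gamma_correct@ is a hypothetical name, not the project's code):

```python
import numpy as np

def gamma_correct(image, gamma):
    """Power-law gamma correction on a uint8 image."""
    normalized = image.astype(np.float32) / 255.0   # 1) normalize to [0, 1]
    corrected = np.power(normalized, gamma)         # 2) apply the power law
    return (corrected * 255.0).astype(np.uint8)     # 3) scale back to [0, 255]

mid_gray = np.full((2, 2), 128, dtype=np.uint8)
print(gamma_correct(mid_gray, 2.0))  # ~64: gamma > 1 darkens
print(gamma_correct(mid_gray, 0.8))  # ~146: gamma < 1 brightens
```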

*[Case 1: γ = 1.0 (Before Gamma Correction)]*

p=. !{width:800px}before.png!

*[Case 2: γ = 0.8 (After Gamma Correction)]*

p=. !{width:800px}after.png!

*[Case 3: γ = 2.0 (After Gamma Correction)]*

p=. !{width:800px}after2.png!

External reference:
[http://rs.aoyaman.com/seminar/about10.html]

---

h2. Intensity Modification

This method adjusts brightness at the image edges using gamma correction and alpha blending, depending on the *image_side* value:

* image_side = 1 → intensity decreases toward the left edge
* image_side = 0 → intensity decreases toward the right edge

By combining gamma correction and alpha blending, the method produces a smooth fading effect along the edges, allowing the image to blend seamlessly with the background or another image.
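A rough sketch of the edge fade, assuming a simple linear ramp across the image width (the actual method combines gamma correction with alpha blending and may differ; @edge_fade@ and the grayscale-only handling are our simplifications):

```python
import numpy as np

def edge_fade(image, image_side):
    """Fade a 2-D grayscale image toward one edge with a linear ramp.

    image_side = 1 -> intensity decreases toward the left edge
    image_side = 0 -> intensity decreases toward the right edge
    """
    h, w = image.shape
    ramp = np.linspace(0.0, 1.0, w)   # 0 at the left, 1 at the right
    if image_side == 0:
        ramp = ramp[::-1]             # flip: 1 at the left, 0 at the right
    return (image.astype(np.float32) * ramp[np.newaxis, :]).astype(np.uint8)

row = np.full((1, 5), 200, dtype=np.uint8)
print(edge_fade(row, 1))  # [[  0  50 100 150 200]] - dark at the left edge
```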

---

h2. Transparency

h3. Gradient Mask Creation
* The *create_mask()* method (in the *MaskCreator* class) generates a smooth transparency gradient using the *smoothstep* function.
* This gradient determines the transparency level across the masked region, providing a non-linear transition from fully transparent to fully opaque (or vice versa).
* The smoothstep function ensures a gradual and visually smooth change in opacity, minimizing harsh edges.
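The project's @create_mask()@ implementation is not reproduced here; the sketch below assumes the classic smoothstep polynomial 3x² − 2x³, which has zero slope at both ends and therefore avoids the abrupt transitions a linear ramp produces:

```python
import numpy as np

def smoothstep(x):
    """Classic smoothstep polynomial 3x^2 - 2x^3, clamped to [0, 1]."""
    x = np.clip(x, 0.0, 1.0)
    return x * x * (3.0 - 2.0 * x)

# A 6-sample gradient: the smoothstep curve starts and ends flat.
t = np.linspace(0.0, 1.0, 6)
print(np.round(smoothstep(t), 3))  # values: 0, 0.104, 0.352, 0.648, 0.896, 1
```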

h3. Transparency Factor

* The *transparency_factor* parameter scales the gradient values, directly influencing overall transparency.
* Lower values are more transparent; values closer to 1 give higher opacity.
h3. Alpha Blending

* The *alpha_blending()* method applies the alpha gradient to the image within a defined *Region of Interest (ROI)*.
* Blending follows the equation:

p=. *blended = (α × ROI) + ((1 − α) × background)*

where (α × ROI) applies the transparency gradient to the image, and ((1 − α) × background) ensures a gradual transition into the black background.

h3. Transparency Side Control

* The direction of the gradient depends on *image_side*:
** *Right side* → gradient applied as-is
** *Left side* → gradient reversed

By combining the smoothstep gradient, transparency factor scaling, and alpha blending, this method enables precise control of transparency on the desired side of the image, producing a seamless blending effect.
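Putting the pieces together, here is a hedged sketch of the whole pipeline: a smoothstep gradient scaled by the transparency factor, optionally reversed for the left side, then blended into a black background. The names @blend_edge@ and the string-valued @image_side@ are ours; the real @alpha_blending()@ method may map sides differently.

```python
import numpy as np

def smoothstep(x):
    x = np.clip(x, 0.0, 1.0)
    return x * x * (3.0 - 2.0 * x)

def blend_edge(roi, transparency_factor=1.0, image_side="right"):
    """Blend a grayscale ROI into a black background with a smoothstep alpha gradient."""
    h, w = roi.shape
    # Gradient rises left -> right, scaled by the transparency factor.
    alpha = smoothstep(np.linspace(0.0, 1.0, w)) * transparency_factor
    if image_side == "left":
        alpha = alpha[::-1]              # reversed gradient for the left side
    alpha = alpha[np.newaxis, :]         # broadcast the gradient down the rows
    background = np.zeros_like(roi, dtype=np.float32)
    blended = alpha * roi.astype(np.float32) + (1.0 - alpha) * background
    return blended.astype(np.uint8)

roi = np.full((1, 5), 200, dtype=np.uint8)
print(blend_edge(roi, 1.0, "right"))  # [[  0  31 100 168 200]] - fades toward the left
```

With a transparency factor below 1, even the fully "opaque" edge is dimmed, which is what lets two overlapping projector edges sum to a uniform brightness.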

---

h2. Limitations and Possible Enhancements (WIP)

h3. Limitations

* The current prototype mainly works with static images, so live video blending and projector alignment adjustments are limited.
* Edge blending only supports the left and right sides; vertical and diagonal blending are not available, which limits flexibility in projector setups.
* The prototype does not support automated calibration or geometric correction, so all alignment between the two projectors must be handled manually.
* The prototype expects identical projectors, so any mismatch in projector properties (brightness, contrast, color tone) can result in visible seams or image inconsistencies.
* The prototype uses fixed input file names, which must be changed manually.

h3. Possible Enhancements

* Adjust parameters such as gamma, blend width, and gradient method dynamically in the GUI without restarting the program
* Support multiple video codecs, resolutions, and output formats
* Expand edge blending to include the top, bottom, and diagonal sides