About our project » History » Version 23

Anderson PHILLIP, 11/08/2025 01:35 AM

[[Wiki|← Back to Start Page]]

h1. Project Details

---

h2. Overview
The goal of this project is to create a unified display by merging two projector images into one.

To achieve this, we employed techniques such as *gamma correction*, *alpha blending*, and *intensity adjustments* to refine the final output.

The project team is organized into several roles:
* Project Manager
* Team Leader
* Doxygen Documentation Team
* Wiki Team
* Coding Team
* UML Design Team
---

h2. Software and Tools
|_. Tool |_. Description |
| *Redmine* | Used to manage project tasks, track progress, and organize team activities. |
| *Python* | Used for image processing operations such as blending, gamma correction, and intensity adjustments. |
| → *OpenCV* | A Python library for computer vision and image processing operations. |
| → *NumPy* | A Python library providing the numerical array operations used throughout the image processing. |
| → *json* | A built-in Python module for reading and writing the JSON configuration files used in video and image processing. |
| → *tkinter* | A built-in Python module for creating the GUI and supporting interactive interface operations. |
| *Doxygen* | Used to generate detailed code documentation automatically. |
| *Astah* | Used to design and visualize UML diagrams representing the system’s structure and workflow. |
---

h2. Technology
We plan to use a single flat projection screen illuminated by two projectors connected to separate computers, allowing both images to be projected simultaneously and combined into one seamless image.
To accomplish this, we will apply:
* Gamma correction
* Alpha blending
* Intensity modification

p=. !{width:800px}car.png!
---

h2. Alpha Blending Method
Alpha blending merges two images using an *alpha value* that determines each image's transparency.
This process is widely used in rendering and game development to combine visual elements smoothly and reduce jagged edges.

Our program uses:

* *Tkinter* – for the graphical interface
* *OpenCV* – for image manipulation
* *PIL* – for image rendering

Users can load configuration files, apply gamma correction and alpha blending, and view or save the processed images.
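As an illustrative sketch (not the project's actual code), the core blend of two equally sized images can be expressed directly in NumPy:

```python
import numpy as np

# Two synthetic same-size "projector" images (grayscale for brevity)
img_a = np.full((4, 4), 200, dtype=np.uint8)
img_b = np.full((4, 4), 50, dtype=np.uint8)

alpha = 0.6  # weight of img_a; img_b receives (1 - alpha)
blended = np.round(alpha * img_a + (1 - alpha) * img_b).astype(np.uint8)

print(blended[0, 0])  # 0.6 * 200 + 0.4 * 50 = 140
```

In practice the same operation can be done with OpenCV's @cv2.addWeighted@, which also saturates the result to the valid 0–255 range.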
p=. !blendingexmaple.jpg!

h3. Interactive Alpha Blending Simulator
To visualize how *alpha blending* and *gamma correction* affect overlapping regions, we developed an interactive simulator.
This tool lets users modify the blending parameters (Alpha A/B, Gamma, and Color) and observe the real-time blending behavior between two projected regions.

*Access the simulator here:*
"Alpha Blending Simulator":https://gilwebsite.pythonanywhere.com
p=. !{width:800px}simulator.png!
_This simulator was developed by a team member as a supporting tool to demonstrate the blending principle used in our project._

External reference:
[https://takinginitiative.net/2010/04/09/directx-10-tutorial-6-transparency-and-alpha-blending/]
---

h2. Gamma Correction Method
This project uses a technique known as *gamma correction*, which modifies an image’s brightness by adjusting the luminance of each pixel in a *non-linear* way. This method aligns image brightness with *human visual perception*, as the human eye is more sensitive to variations in darker tones than in brighter ones. This biological adaptation allows us to see effectively across a wide range of lighting conditions. Gamma correction applies a *power-law transformation* to pixel values, as represented by the following equation:
p=. !{width:800px}equation.png!

When the gamma value (γ) is greater than 1, the image appears darker; when it is less than 1, the image appears brighter.

Note: Since pixel intensity values range from 0 to 255, the process first normalizes the image (dividing by 255), then applies the power function with the specified gamma value, and finally scales the result back to the 0–255 range.
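The normalize → power → rescale steps in the note above can be sketched as follows (an illustrative sketch, not the project's exact implementation):

```python
import numpy as np

def gamma_correct(img: np.ndarray, gamma: float) -> np.ndarray:
    """Apply power-law gamma correction to an 8-bit image."""
    normalized = img.astype(np.float64) / 255.0          # map 0-255 to 0-1
    corrected = np.power(normalized, gamma)              # power-law transform
    return np.round(corrected * 255.0).astype(np.uint8)  # back to 0-255

img = np.array([[0, 64, 128, 255]], dtype=np.uint8)
print(gamma_correct(img, 2.0))  # gamma > 1 darkens the mid-tones
print(gamma_correct(img, 0.8))  # gamma < 1 brightens the mid-tones
```

Note that 0 and 255 map to themselves for any gamma; only the mid-tones shift.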
*[Case 1: γ = 1.0 (before gamma correction)]*

p=. !{width:800px}before.png!

External reference:
[http://rs.aoyaman.com/seminar/about10.html]

*[Case 2: γ = 0.8 (after gamma correction)]*

p=. !{width:800px}after.png!

External reference:
[http://rs.aoyaman.com/seminar/about10.html]

*[Case 3: γ = 2.0 (after gamma correction)]*

p=. !{width:800px}after2.png!

External reference:
[http://rs.aoyaman.com/seminar/about10.html]
---

h2. Intensity Modification
This method adjusts brightness at the image edges using gamma correction and alpha blending, depending on the *image_side* value:

* image_side = 1 → intensity decreases toward the left edge
* image_side = 0 → intensity decreases toward the right edge

By combining gamma correction and alpha blending, the method produces a smooth fading effect along the edges, allowing the image to blend seamlessly with the background or another image.
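A minimal sketch of the side-dependent fade, assuming the *image_side* convention above (the gamma-correction part of the real method is omitted here, and the function name and blend width are illustrative):

```python
import numpy as np

def edge_fade(img: np.ndarray, image_side: int, blend_width: int) -> np.ndarray:
    """Fade a grayscale image toward one edge (illustrative sketch).

    image_side = 1 -> intensity decreases toward the left edge
    image_side = 0 -> intensity decreases toward the right edge
    """
    _, w = img.shape
    ramp = np.linspace(0.0, 1.0, blend_width)  # 0 at the faded edge, 1 inside
    alpha = np.ones(w)
    if image_side == 1:
        alpha[:blend_width] = ramp             # left edge fades out
    else:
        alpha[w - blend_width:] = ramp[::-1]   # right edge fades out
    return (img * alpha[np.newaxis, :]).astype(np.uint8)

img = np.full((2, 8), 200, dtype=np.uint8)
print(edge_fade(img, 1, 4)[0])  # left columns ramp up from 0; interior stays 200
```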
---

h2. Transparency

h3. Gradient Mask Creation
* The *create_mask()* method (in the *MaskCreator* class) generates a smooth transparency gradient using the *smoothstep* function.
* This gradient determines the transparency level across the masked region, providing a non-linear transition from fully transparent to fully opaque (or vice versa).
* The smoothstep function ensures a gradual and visually smooth change in opacity, minimizing harsh edges.
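The standard smoothstep polynomial is 3t² − 2t³, which has zero slope at both ends, giving the ease-in/ease-out transition described above. A minimal sketch of a gradient mask built from it (the real @create_mask()@ implementation may differ):

```python
import numpy as np

def smoothstep(t: np.ndarray) -> np.ndarray:
    """Standard smoothstep: 3t^2 - 2t^3, with t clipped to [0, 1]."""
    t = np.clip(t, 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

def create_gradient_mask(width: int) -> np.ndarray:
    """1-D alpha gradient from fully transparent (0) to fully opaque (1)."""
    t = np.linspace(0.0, 1.0, width)
    return smoothstep(t)

mask = create_gradient_mask(5)
print(mask)  # 0 -> 0.15625 -> 0.5 -> 0.84375 -> 1: flat at the ends, steep in the middle
```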
h3. Transparency Factor

* The *transparency_factor* parameter scales the gradient values, directly influencing the overall transparency.
* Lower values = more transparent; values closer to 1 = higher opacity.
h3. Alpha Blending

* The *alpha_blending()* method applies the alpha gradient to the image within a defined *Region of Interest (ROI)*.
* Blending follows the equation:
p=. *blended = (α × ROI) + (1 − α) × background*

where:
(α × ROI) applies the transparency gradient to the image, and
(1 − α) × background ensures a gradual transition into the black background.
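A minimal sketch of this equation over a black background (the function name and the linear gradient here are illustrative; the project's @alpha_blending()@ method uses the smoothstep gradient instead):

```python
import numpy as np

def alpha_blend_roi(roi: np.ndarray, alpha: np.ndarray) -> np.ndarray:
    """blended = (alpha * ROI) + (1 - alpha) * background, black background."""
    background = np.zeros_like(roi, dtype=np.float64)
    blended = alpha * roi + (1.0 - alpha) * background
    return np.round(blended).astype(np.uint8)

roi = np.full((2, 5), 200, dtype=np.uint8)       # uniform bright region
alpha = np.linspace(0.0, 1.0, 5)[np.newaxis, :]  # alpha gradient across the ROI
print(alpha_blend_roi(roi, alpha)[0])  # fades from 0 (black) up to 200
```

With a black background the second term vanishes, so the blend reduces to scaling the ROI by the alpha gradient.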
h3. Transparency Side Control

* The direction of the gradient depends on *image_side*:
** *Right side* → gradient applied as-is
** *Left side* → gradient reversed

By combining the smoothstep gradient, transparency factor scaling, and alpha blending, this method enables precise control of transparency on the desired side of the image, producing a seamless blending effect.
---

h2. Limitations and Possible Enhancements (WIP)

h3. Limitations
* The current prototype mainly works with static images, so live video blending and projected alignment adjustments are limited.
* Edge blending only supports the left and right sides; vertical and diagonal blending are not available, which limits flexibility in projector setups.
* The prototype does not support automated calibration or geometric correction, so all alignment between the two projectors must be handled manually.
* The prototype expects identical projectors, so any mismatch in projector properties (brightness, contrast, color tone) can result in visible seams or image inconsistencies.
* The prototype uses fixed input file names, which must be changed manually.
h3. Possible Enhancements

* Allow parameters such as gamma, blend width, and gradient method to be adjusted dynamically in the GUI without restarting the program
* Support multiple video codecs, resolutions, and output formats
* Expand edge blending to include the top, bottom, and diagonal sides
* Expand edge blending to include top, bottom, and diagonal sides