
About the Project » History » Version 11

Ty Hikaru DAULTON, 10/21/2024 01:39 PM

___

%{color:blue}[[/|Home]]% | [[About Us]] | [[About the Project]] | [[UML Diagrams]] | [[Codes]] | [[Outcome]]

___

h1=. About the Project

Our project aims to deliver a final product that combines the images from two separate projectors into a single seamless image. The two images are processed using image processing techniques, including gamma correction, alpha blending, and intensity modification, to achieve the desired final appearance. Our team consists of a project manager, a leader, and sub-teams dedicated to **%{color: blue}Doxygen%** generation, wiki management, coding, commenting, and **%{color: red}UML%** design.

We primarily rely on **%{color: green}OpenCV%** and **%{color: purple}Python%** for our image processing.

h2. **Key Aspects of the Project:**

- **Image Processing with %{color: purple}Python% and %{color: green}OpenCV%**: We use %{color: purple}Python% in combination with **%{color: green}OpenCV%**, a comprehensive image processing library, to handle complex image analysis and processing tasks efficiently.

- **Structured Design with %{color: red}UML%**: We apply **%{color: red}Unified Modeling Language% (%{color: red}UML%)** to create a clear and structured design for our project. **%{color: red}UML%** allows us to visually represent the system's architecture and workflows, making the design easy to understand and follow.

- **Thorough Documentation with %{color: blue}Doxygen%**: Our code is meticulously documented using **%{color: blue}Doxygen%**, ensuring that it is clear, maintainable, and adaptable for future use.

- **Project Management with Redmine**: We use Redmine to manage and track project progress, coordinate tasks, and facilitate team collaboration. This tool helps keep the project organized and on schedule.

table{width: 100%}.
|={width: 30%; background-color: #aaf;}. **Synopsis of Technology** |

!2Projector1Image.png!

Our project's **objective** is to produce a single, large, **seamless image** on a flat screen using **two projectors**. The setup involves a flat screen and **two laptops**, with the projectors aimed directly at the screen. To improve **image quality**, we apply techniques such as **alpha blending** and **gamma correction**.

Assuming the screen width is **1280 mm**, the two projectors are placed a distance '**d**' apart, where **d** is less than the screen width. The width of the **overlap area** between the images from the two projectors is then given by '**x = screen width - d**'. This formula makes explicit the relationship between the **screen width**, the **distance between the projectors**, and the size of the **overlapping area**.
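As a quick worked example of the formula (the 1280 mm screen width comes from the text; the projector distance below is an assumed placeholder value):

```python
# Worked example of the overlap formula: x = screen width - d.
# screen_width (1280 mm) is from the text; d is an assumed example value.
screen_width = 1280  # mm
d = 1000             # mm, distance between the two projectors (assumed)

overlap = screen_width - d
print(overlap, "mm of overlap")  # 280 mm with these values
```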

table{width: 100%}.
|={width: 30%; background-color: #aaf;}. **Gamma Correction Method** |

!gamma_correction_example.png!

This method applies **gamma correction** to an image to adjust its **luminance**. Gamma correction is particularly useful for correcting the **brightness** of an image or adjusting its **contrast**. It is a **nonlinear operation** applied to each pixel value. In contrast to linear operations such as adding, subtracting, or multiplying, which affect all pixels uniformly, gamma correction rescales each pixel's **intensity** along a nonlinear curve. It is important to choose a moderate **gamma value**: values that are too small or too large produce an overly bright or overly dark result.

gamma_corrected = (image / 255.0)^gamma * 255

Note: **gamma** is the provided gamma value. The original **image** is first **normalized** (divided by 255, since pixel values range from 0 to 255), raised to the power of gamma, and then rescaled to the **0-255 range**.
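The note above can be sketched in a few lines of NumPy (OpenCV images are NumPy arrays, so the same code applies directly; the function name here is our own, not the project's):

```python
import numpy as np

def gamma_correct(image, gamma):
    """Apply gamma correction: normalize to [0, 1], raise to gamma, rescale to [0, 255]."""
    normalized = image.astype(np.float64) / 255.0
    corrected = np.power(normalized, gamma) * 255.0
    return corrected.astype(np.uint8)

# A gamma below 1 brightens mid-tones; a gamma above 1 darkens them.
img = np.array([[0, 64, 128, 255]], dtype=np.uint8)
bright = gamma_correct(img, 0.5)
```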

table{width: 100%}.
|={width: 30%; background-color: #aaf;}. **Alpha Blending Method** |

!AlphaBlending.png!

Alpha blending is a **technique** used in **computer graphics** when layering graphics, where one or more objects carry a level of **transparency**. The **visible pixels** of the graphic beneath a transparent area have their **color** or **brightness** adjusted according to the transparency level of the upper object. An **alpha channel** acts as a mask that controls how much of the **underlying graphics** shows through.

Our method applies **alpha blending** specifically at the **edges** of an image. It combines an image with a background to create the appearance of **partial** or **full transparency**; typically, it is used to blend two images. The blending behavior depends on the value of **image_side**:

- **If image_side is 1**, the method blends the **left edge** of the image.
- **If image_side is 0**, it blends the **right edge** of the image.

Blending mixes two images: the **gamma-corrected image** and the **current result image**. This is done using an **alpha value** that varies linearly across the mask width (**mask**), producing a smooth **transition** between the two images.
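A minimal NumPy sketch of this edge blending. The function name and the exact linear ramp are our assumptions, not the project's actual code; it only illustrates the cross-fade described above, with image_side selecting the left (1) or right (0) edge:

```python
import numpy as np

def blend_edge(corrected, result, mask, image_side):
    """Alpha-blend a mask-wide strip: left edge if image_side == 1, right if 0.
    Alpha ramps linearly from 0 at the outermost column (fully 'result')
    to 1 at the inner boundary (fully 'corrected'), giving a smooth cross-fade."""
    out = corrected.astype(np.float64).copy()
    w = corrected.shape[1]
    for i in range(mask):
        alpha = i / (mask - 1) if mask > 1 else 1.0
        col = i if image_side == 1 else w - 1 - i
        out[:, col] = alpha * corrected[:, col] + (1 - alpha) * result[:, col]
    return out.astype(np.uint8)

# Example: a uniform 200-valued image cross-fading into a black 'result'
# over a 4-pixel-wide mask on the left edge.
corrected = np.full((2, 4), 200, dtype=np.uint8)
result = np.zeros((2, 4), dtype=np.uint8)
blended = blend_edge(corrected, result, 4, 1)
```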

table{width: 100%}.
|={width: 30%; background-color: #aaf;}. **Intensity Modification** |

This **method** modifies the **intensity** at the **edges** of an image. Similar to **alpha blending**, the effect changes based on the value of **image_side**:

- **If image_side is 1**, the intensity is gradually reduced towards the **left edge**.
- **If image_side is 0**, the intensity is gradually reduced towards the **right edge**.

The **intensity factor** is computed to decrease **linearly** as it approaches the edge of the **mask**. This creates a **fading effect** at the edges, where the image gradually fades into the **background** or into another image.
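A minimal NumPy sketch of this linear intensity falloff; the function name and exact ramp are illustrative assumptions rather than the project's actual implementation:

```python
import numpy as np

def modify_intensity(image, mask, image_side):
    """Scale pixel intensity by a factor that falls off linearly toward one edge:
    left edge if image_side == 1, right edge if image_side == 0.
    The factor is 0 at the outermost column and approaches 1 at the inner
    mask boundary, so the image fades out at the edge."""
    out = image.astype(np.float64)
    w = image.shape[1]
    for i in range(mask):
        factor = i / mask  # assumed linear ramp: 0 at the very edge
        col = i if image_side == 1 else w - 1 - i
        out[:, col] *= factor
    return out.astype(np.uint8)

# Example: a uniform image of value 100 fading out toward the right edge.
img = np.full((1, 5), 100, dtype=np.uint8)
faded = modify_intensity(img, 4, 0)
```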