diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/.pages b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/.pages
index 9cb01955bdf..ee9aa76a43e 100644
--- a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/.pages
+++ b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/.pages
@@ -1,3 +1,3 @@
nav:
- - evaluating-controller-performance.md
- - evaluating-real-time-performance.md
+ - Sample tuning for campus environment: tuning-parameters
+ - Evaluation: evaluation
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/evaluation/.pages b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/evaluation/.pages
new file mode 100644
index 00000000000..2cd5a51e38e
--- /dev/null
+++ b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/evaluation/.pages
@@ -0,0 +1,3 @@
+nav:
+ - Evaluating the controller performance: evaluating-controller-performance
+ - Evaluating real-time performance: evaluating-real-time-performance
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/images/evaluating-controller-performance/controller-monitor.png b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/evaluation/evaluating-controller-performance/images/evaluating-controller-performance/controller-monitor.png
similarity index 100%
rename from docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/images/evaluating-controller-performance/controller-monitor.png
rename to docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/evaluation/evaluating-controller-performance/images/evaluating-controller-performance/controller-monitor.png
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/images/evaluating-controller-performance/export-cvs.png b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/evaluation/evaluating-controller-performance/images/evaluating-controller-performance/export-cvs.png
similarity index 100%
rename from docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/images/evaluating-controller-performance/export-cvs.png
rename to docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/evaluation/evaluating-controller-performance/images/evaluating-controller-performance/export-cvs.png
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/images/evaluating-controller-performance/import-data.png b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/evaluation/evaluating-controller-performance/images/evaluating-controller-performance/import-data.png
similarity index 100%
rename from docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/images/evaluating-controller-performance/import-data.png
rename to docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/evaluation/evaluating-controller-performance/images/evaluating-controller-performance/import-data.png
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/images/evaluating-controller-performance/plot-xy.png b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/evaluation/evaluating-controller-performance/images/evaluating-controller-performance/plot-xy.png
similarity index 100%
rename from docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/images/evaluating-controller-performance/plot-xy.png
rename to docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/evaluation/evaluating-controller-performance/images/evaluating-controller-performance/plot-xy.png
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/images/evaluating-controller-performance/start-plotjuggler.png b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/evaluation/evaluating-controller-performance/images/evaluating-controller-performance/start-plotjuggler.png
similarity index 100%
rename from docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/images/evaluating-controller-performance/start-plotjuggler.png
rename to docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/evaluation/evaluating-controller-performance/images/evaluating-controller-performance/start-plotjuggler.png
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/evaluating-controller-performance.md b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/evaluation/evaluating-controller-performance/index.md
similarity index 98%
rename from docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/evaluating-controller-performance.md
rename to docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/evaluation/evaluating-controller-performance/index.md
index 50989f798f2..096a520e587 100644
--- a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/evaluating-controller-performance.md
+++ b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/evaluation/evaluating-controller-performance/index.md
@@ -15,7 +15,7 @@ If you need more detailed information about package, refer to the [control_perfo
#### 2. Initialize the vehicle and send goal position to create route
-- If you have any problem with launching Autoware, please see the [tutorials](../../../tutorials/index.md) page.
+- If you have any problem with launching Autoware, please see the [tutorials](../../../../../tutorials/index.md) page.
#### 3. Launch the control_performance_analysis package
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/evaluating-real-time-performance.md b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/evaluation/evaluating-real-time-performance/index.md
similarity index 100%
rename from docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/evaluating-real-time-performance.md
rename to docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/evaluation/evaluating-real-time-performance/index.md
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/.pages b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/.pages
new file mode 100644
index 00000000000..e4c5615857f
--- /dev/null
+++ b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/.pages
@@ -0,0 +1,5 @@
+nav:
+ - index.md
+ - Tuning localization: localization-tuning
+ - Tuning perception: perception-tuning
+ - Tuning planning: planning-tuning
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/images/ytu-campus-lanelet2-map.png b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/images/ytu-campus-lanelet2-map.png
new file mode 100644
index 00000000000..2a796169fab
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/images/ytu-campus-lanelet2-map.png differ
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/images/ytu-campus-pcd-map.png b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/images/ytu-campus-pcd-map.png
new file mode 100644
index 00000000000..5decb28ddd1
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/images/ytu-campus-pcd-map.png differ
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/images/ytu-campus.png b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/images/ytu-campus.png
new file mode 100644
index 00000000000..0d927d9d2c8
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/images/ytu-campus.png differ
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/index.md b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/index.md
new file mode 100644
index 00000000000..4c361c2247d
--- /dev/null
+++ b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/index.md
@@ -0,0 +1,72 @@
+# Sample tuning for campus environment
+
+## Introduction
+
+This section explains how to fine-tune Autoware's localization, perception,
+and planning stacks for the Yıldız Technical University (YTU) campus environment.
+It serves as a roadmap for optimizing Autoware's performance in a sample
+university campus, and the same tuning approach can be applied to similar
+complex real-world environments.
+
+## Yıldız Technical University Campus Environment
+
+[Yıldız Technical University (YTU)](https://yildiz.edu.tr/en) is located in Istanbul, Turkey,
+and it encompasses multiple campuses. One of these campuses is the
+Davutpaşa Campus, where we operate autonomous vehicles. Here is some
+general information about the YTU Davutpaşa Campus:
+
+1. **Slopes and Terrains**
+
   YTU Davutpaşa Campus has varied topography.
   Some areas are flat, while others have gentle or steep slopes.
   These features influence accessibility and landscaping choices.
+
+2. **Greenery and Trees**
+
   YTU Davutpaşa Campus is landscaped with a variety of plants and trees, which provide
   aesthetic appeal and shade and contribute to the overall environment. There are designated
   green spaces, gardens, and courtyards.
+
+3. **Building structures**
+
   The Davutpaşa Campus features a diverse range of buildings, from small structures
   to larger edifices. These buildings serve as useful reference features for NDT localization.
+
+
+
+## Yıldız Technical University Campus Map
+
+### Yıldız Technical University Campus Pointcloud Map
+
+Using the LIO-SAM mapping package, we created a point cloud map of Yıldız
+Technical University's Davutpaşa campus. This real-time lidar-inertial odometry system
+allowed us to map this campus environment with good precision. For detailed information on how we
+constructed the point cloud map at YTU Davutpaşa, please refer to the
+[LIO-SAM page](../../creating-maps/open-source-slam/lio-sam). Additionally,
+we converted the output map into the MGRS coordinate format.
+
+
+
+### Yıldız Technical University Campus Lanelet2 Map
+
+We have generated a Lanelet2 HD map, incorporating regulatory elements such as crosswalks,
+speed bumps, and stop lines, tailored to the YTU campus environment. For more detailed
+information on the Lanelet2 map creation process, please refer to the [`Creating Maps` page](../../creating-maps).
+
+
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/localization-tuning/images/ndt-range-150m.png b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/localization-tuning/images/ndt-range-150m.png
new file mode 100644
index 00000000000..222a57d0563
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/localization-tuning/images/ndt-range-150m.png differ
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/localization-tuning/images/ndt-range-60m.png b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/localization-tuning/images/ndt-range-60m.png
new file mode 100644
index 00000000000..7fcd599ea88
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/localization-tuning/images/ndt-range-60m.png differ
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/localization-tuning/images/pcd-range-cloud-compare.png b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/localization-tuning/images/pcd-range-cloud-compare.png
new file mode 100644
index 00000000000..ebf927249a2
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/localization-tuning/images/pcd-range-cloud-compare.png differ
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/localization-tuning/images/voxel-size-1.0.png b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/localization-tuning/images/voxel-size-1.0.png
new file mode 100644
index 00000000000..615577b7899
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/localization-tuning/images/voxel-size-1.0.png differ
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/localization-tuning/images/voxel-size-3.0.png b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/localization-tuning/images/voxel-size-3.0.png
new file mode 100644
index 00000000000..b5c0d5e14e9
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/localization-tuning/images/voxel-size-3.0.png differ
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/localization-tuning/index.md b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/localization-tuning/index.md
new file mode 100644
index 00000000000..6dc17ba0eec
--- /dev/null
+++ b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/localization-tuning/index.md
@@ -0,0 +1,148 @@
+# Tuning localization
+
+## Introduction
+
+In this section, we will improve localization accuracy in the YTU Campus environment
+by updating localization parameters and methods.
+Our approach uses NDT as the pose input source and the gyro odometer as the twist input source.
+These adjustments help us achieve more precise and reliable localization
+under the specific conditions of the YTU campus.
+
+## NDT parameter tuning
+
+### Crop-box filter for localization input
+
+- In our campus environment, certain areas can be challenging for NDT localization,
+ particularly those near cliffs or wooded areas that are far from buildings.
+
+
+
+- In these areas, the default NDT measurement range
+  (the crop-box filter applied to the NDT input point cloud in the localization point cloud pipeline)
+  may be insufficient for aligning point clouds.
+  The default NDT input point cloud parameters are shown below:
+
+!!! note "The default [crop_box_filter_measurement_range.param.yaml](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/localization/crop_box_filter_measurement_range.param.yaml) file for localization util."
+
+ ```yaml
+ /**:
+ ros__parameters:
+ input_frame: "base_link"
+ output_frame: "base_link"
+ min_x: -60.0
+ max_x: 60.0
+ min_y: -60.0
+ max_y: 60.0
+ min_z: -30.0
+ max_z: 50.0
+ negative: False
+ ```
+
+- The green points (topic name: `/localization/pose_estimator/points_aligned`)
+ represent the NDT localization aligned points on the map in the image below.
+ The default range is 60 meters, meaning points beyond this distance cannot be utilized.
+
+
+
+- If we wish to increase the NDT input point cloud range,
+  we can make the following changes in the `crop_box_filter_measurement_range.param.yaml` file.
+  However, keep in mind that enlarging the NDT input point cloud
+  requires additional processing resources.
+  A minimal sketch of how such a crop box works is shown after the parameter change below.
+
+!!! note "[`crop_box_filter_measurement_range.param.yaml`](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/localization/crop_box_filter_measurement_range.param.yaml) parameter file:"
+
+ ```diff
+ /**:
+ ros__parameters:
+ input_frame: "base_link"
+ output_frame: "base_link"
+ - min_x: -60.0
+ + min_x: -150.0
+ - max_x: 60.0
+ + max_x: 150.0
+ - min_y: -60.0
+ + min_y: -150.0
+ - max_y: 60.0
+ + max_y: 150.0
+ min_z: -30.0
+ max_z: 50.0
+ negative: False
+ ```
+
+
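+The following minimal Python sketch (illustrative only, not Autoware code) shows what the crop-box
+bounds do: only points inside the configured box around `base_link` are kept as NDT input,
+so a larger box keeps more distant points at the cost of extra processing.
+
+```python
+import numpy as np
+
+def crop_box_filter(points, min_x, max_x, min_y, max_y, min_z, max_z, negative=False):
+    """Keep points inside the axis-aligned box (or outside it if negative=True)."""
+    inside = (
+        (points[:, 0] >= min_x) & (points[:, 0] <= max_x)
+        & (points[:, 1] >= min_y) & (points[:, 1] <= max_y)
+        & (points[:, 2] >= min_z) & (points[:, 2] <= max_z)
+    )
+    return points[~inside] if negative else points[inside]
+
+# 100k random lidar-like points spread over +-200 m around base_link
+rng = np.random.default_rng(0)
+cloud = rng.uniform([-200.0, -200.0, -5.0], [200.0, 200.0, 20.0], size=(100_000, 3))
+
+kept_default = crop_box_filter(cloud, -60, 60, -60, 60, -30, 50)       # default 60 m range
+kept_extended = crop_box_filter(cloud, -150, 150, -150, 150, -30, 50)  # extended 150 m range
+
+# The larger box keeps more points, which helps alignment far from buildings
+# but increases the NDT input size and therefore the processing load.
+print(len(kept_default), len(kept_extended))
+```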
+
+### Voxel-grid filter for localization input
+
+- Voxel Grid filtering is a technique used in point cloud pre-processing
+ to reduce the density of 3D point cloud data while preserving its overall structure.
+ This is especially useful in scenarios
+ where high-resolution point clouds are computationally expensive
+ to process or unnecessary for the task at hand.
+ The default voxel size for all three axes in Autoware is 3.0.
+ If you have additional computational resources,
+ reducing the voxel size can enhance localization accuracy.
+ However, please be aware that this will demand more computational power.
+
+!!! note " The default [voxel_grid_filter.param.yaml](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/localization/voxel_grid_filter.param.yaml) file for localization util."
+
+ ```yaml
+ /**:
+ ros__parameters:
+ voxel_size_x: 3.0
+ voxel_size_y: 3.0
+ voxel_size_z: 3.0
+ ```
+
+- The default voxel size for downsampling is 3.0,
+ and the resulting aligned points will resemble the image below.
+
+
+
+- We have sufficient computational power available on our tutorial vehicle,
+  so we will reduce the voxel size to improve localization accuracy.
+  Feel free to experiment with the voxel size for your own computer setup;
+  a small sketch of how voxel downsampling works is shown after the parameter change below.
+
+!!! note "[voxel_grid_filter.param.yaml](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/localization/voxel_grid_filter.param.yaml) parameter file:"
+
+ ```diff
+
+ /**:
+ ros__parameters:
+ - voxel_size_x: 3.0
+ + voxel_size_x: 1.0
+ - voxel_size_y: 3.0
+ + voxel_size_y: 1.0
+ - voxel_size_z: 3.0
+ + voxel_size_z: 1.0
+ ```
+
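+The effect of the voxel size can be illustrated with the following self-contained sketch
+(not Autoware code): all points falling into the same cubic voxel are replaced by their centroid,
+so a smaller voxel size preserves more detail for NDT at a higher computational cost.
+
+```python
+import numpy as np
+
+def voxel_grid_downsample(points, voxel_size):
+    """Replace all points that fall into the same cubic voxel by their centroid."""
+    voxel_indices = np.floor(points / voxel_size).astype(np.int64)
+    _, inverse, counts = np.unique(voxel_indices, axis=0, return_inverse=True, return_counts=True)
+    inverse = inverse.ravel()               # flatten for robustness across NumPy versions
+    centroids = np.zeros((len(counts), 3))
+    np.add.at(centroids, inverse, points)   # sum the points of each voxel
+    return centroids / counts[:, None]      # then average them
+
+rng = np.random.default_rng(0)
+cloud = rng.uniform([-60.0, -60.0, -2.0], [60.0, 60.0, 10.0], size=(50_000, 3))
+
+coarse = voxel_grid_downsample(cloud, voxel_size=3.0)  # default
+fine = voxel_grid_downsample(cloud, voxel_size=1.0)    # denser NDT input
+
+# The smaller voxel size yields many more points for NDT to match against.
+print(len(coarse), len(fine))
+```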
+
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/perception-tuning/images/after-tuning-clustering.png b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/perception-tuning/images/after-tuning-clustering.png
new file mode 100644
index 00000000000..02c08f180ec
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/perception-tuning/images/after-tuning-clustering.png differ
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/perception-tuning/images/after-tuning-ground-segmentation.png b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/perception-tuning/images/after-tuning-ground-segmentation.png
new file mode 100644
index 00000000000..31c364475ef
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/perception-tuning/images/after-tuning-ground-segmentation.png differ
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/perception-tuning/images/ground-remover-ghost-points.png b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/perception-tuning/images/ground-remover-ghost-points.png
new file mode 100644
index 00000000000..132b0595603
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/perception-tuning/images/ground-remover-ghost-points.png differ
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/perception-tuning/images/initial-clusters.png b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/perception-tuning/images/initial-clusters.png
new file mode 100644
index 00000000000..e60fe26e38e
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/perception-tuning/images/initial-clusters.png differ
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/perception-tuning/index.md b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/perception-tuning/index.md
new file mode 100644
index 00000000000..df7a4e3782e
--- /dev/null
+++ b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/perception-tuning/index.md
@@ -0,0 +1,247 @@
+# Tuning perception
+
+## Introduction
+
+In this section, we will improve perception accuracy within the YTU Campus environment
+by updating some parameters and methods.
+We will enable camera-lidar fusion as our perception method,
+which improves the vehicle's ability to perceive and classify its surroundings,
+allowing it to navigate more effectively and safely within the campus.
+By fine-tuning these perception parameters,
+we aim to further optimize performance in this specific environment.
+
+## Perception parameter tuning
+
+### Enabling camera-lidar fusion
+
+- To enable camera-lidar fusion, you first need to calibrate both your camera and lidar.
+  After that, you will need your cameras' `camera_info`
+  and rectified image topics in order to run the `tensorrt_yolo` node.
+  Once these ROS 2 topics are available,
+  we can enable camera-lidar fusion as our perception method:
+
+!!! note "Enabling camera lidar fusion on [`autoware.launch.xml`](https://github.com/autowarefoundation/autoware_launch/blob/2255356e0164430ed5bc7dd325e3b61e983567a3/autoware_launch/launch/autoware.launch.xml#L42)"
+
+ ```diff
+ -
+ +
+ ```
+
+After that,
+we need to run the [TensorRT YOLO node](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/tensorrt_yolo) for our camera topics
+if it is not already launched by your sensor model.
+You can launch the tensorrt_yolo nodes by uncommenting the following lines in the [`camera_lidar_fusion_based_detection.launch.xml`](https://github.com/autowarefoundation/autoware.universe/blob/main/launch/tier4_perception_launch/launch/object_recognition/detection/camera_lidar_fusion_based_detection.launch.xml)
+file:
+
+!!! note "Please adjust the following lines in the `camera_lidar_fusion_based_detection.launch.xml` file based on the number of your cameras (image_number)"
+
+ ```xml
+
+
+
+
+ ...
+ ```
+
+- Also, you need to update the `roi_sync.param.yaml` parameter file according to your camera count.
+  Firstly,
+  please refer to the roi_cluster_fusion documentation for more information about this package.
+  Then, update your camera timestamp offsets.
+  For example,
+  if you have four cameras in the perception detection pipeline
+  and you have not measured their timestamp offsets,
+  you can set the camera offsets to "0" as an initial value.
+  Please be careful with the offset array size; it must equal your camera count.
+
+!!! note "[roi_sync.param.yaml](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/perception/object_recognition/detection/image_projection_based_fusion/roi_sync.param.yaml) parameter file:"
+
+ ```diff
+ - input_offset_ms: [61.67, 111.67, 45.0, 28.33, 78.33, 95.0] # 6 cameras
+ + input_offset_ms: [0.0, 0.0, 0.0, 0.0] # 4 cameras
+ ```
+
+- If you have used different namespaces for your camera and ROI topics,
+ you will need to add the input topics for camera_info,
+ image_raw,
+ and rois messages in the `tier4_perception_component.launch.xml` launch file.
+
+!!! note "[`tier4_perception_component.launch.xml`](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/launch/components/tier4_perception_component.launch.xml) launch file:"
+
+ ```diff
+ -
+ +
+ -
+ +
+ -
+ +
+ ```
+
+### Tuning ground segmentation
+
+- The ground segmentation package removes the ground points from the input point cloud for the perception pipeline.
+  In our campus environment, there are many steep slopes and rough roads,
+  which make it difficult to accurately segment ground and non-ground points.
+
+- For example, when we pass over speed bumps,
+ there are a lot of false positives (ghost points) that appear as non-ground points,
+ as shown in the image below.
+
+
+
+- These ghost points affect the motion planner of Autoware,
+ causing the vehicle to stop even though there is no obstacle on the road during autonomous driving.
+ We will reduce the number of false positive non-ground points
+ by fine-tuning the ground segmentation in Autoware.
+
+- There are three different ground segmentation algorithms included in Autoware:
+ `ray_ground_filter`, `scan_ground_filter`, and `ransac_ground_filter`.
+ The default method is the `scan_ground_filter`.
+ Please refer to the [`ground_segmentation` package documentation](https://autowarefoundation.github.io/autoware.universe/main/perception/ground_segmentation/)
+ for more information about these methods and their parameter definitions.
+
+- Firstly,
+  we will change the `global_slope_max_angle_deg` value from 10 to 30 degrees in the [`ground_segmentation.param.yaml`](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/perception/obstacle_segmentation/ground_segmentation/ground_segmentation.param.yaml) parameter file.
+  This change will reduce our false positive non-ground points.
+  However, be cautious when increasing the threshold,
+  as it may also increase the number of false negatives.
+  A simplified sketch of the slope check is shown after the parameter change below.
+
+!!! note "[`ground_segmentation.param.yaml`](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/perception/obstacle_segmentation/ground_segmentation/ground_segmentation.param.yaml) parameter file:"
+
+ ```diff
+ - global_slope_max_angle_deg: 10.0
+ + global_slope_max_angle_deg: 30.0
+ ```
+
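+The role of `global_slope_max_angle_deg` can be illustrated with a very simplified sketch
+(the actual `scan_ground_filter` works on sorted radial scan segments and uses several more criteria;
+this only shows the idea of the global slope limit):
+
+```python
+import numpy as np
+
+def is_probably_ground(point, global_slope_max_angle_deg):
+    """Simplified view: treat a point as ground if the line from the sensor footprint
+    to the point rises less steeply than the configured slope limit."""
+    x, y, z = point
+    distance = np.hypot(x, y)
+    slope_deg = np.degrees(np.arctan2(z, distance))
+    return abs(slope_deg) <= global_slope_max_angle_deg
+
+# A point 0.8 m high seen 3 m ahead (roughly a 15-degree slope),
+# e.g. the road surface at the top of a speed bump on a sloped road.
+point = (3.0, 0.0, 0.8)
+print(is_probably_ground(point, 10.0))  # False -> reported as an obstacle (ghost point)
+print(is_probably_ground(point, 30.0))  # True  -> treated as ground
+```
+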
+- Then we will update the split_height_distance parameter from 0.2 to 0.35 meters.
+ This adjustment will help in reducing false positive non-ground points,
+ especially on step-like road surfaces or in cases of misaligned multiple lidar configurations.
+
+!!! note "[`ground_segmentation.param.yaml`](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/perception/obstacle_segmentation/ground_segmentation/ground_segmentation.param.yaml) parameter file:"
+
+ ```diff
+ - split_height_distance: 0.2
+ + split_height_distance: 0.35
+ ```
+
+- Now, we will change the non_ground_height_threshold value from 0.2 to 0.3 meters.
+ This will help us in reducing false positive non-ground points,
+ but it may also decrease the number of true positive non-ground points
+ that are below this threshold value.
+
+!!! note "[`ground_segmentation.param.yaml`](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/perception/obstacle_segmentation/ground_segmentation/ground_segmentation.param.yaml) parameter file:"
+
+ ```diff
+ - non_ground_height_threshold: 0.2
+ + non_ground_height_threshold: 0.3
+ ```
+
+- The following image illustrates the results after these fine-tunings with the ground remover package.
+
+
+
+- You need to tune the ground segmentation according to your own environment.
+  These examples are provided for high slopes and rough road conditions.
+  If your roads are in better condition,
+  you can adjust your parameters
+  by referring to the [`ground_segmentation` package documentation page](https://autowarefoundation.github.io/autoware.universe/main/perception/ground_segmentation/).
+
+### Tuning euclidean clustering
+
+- The `euclidean_cluster` package applies Euclidean clustering methods
+  to group points into smaller clusters for object classification.
+  Please refer to the [`euclidean_cluster` package documentation](https://github.com/autowarefoundation/autoware.universe/tree/main/perception/euclidean_cluster) for more information.
+  This package is used in the detection pipeline of the Autoware architecture.
+  It includes two different Euclidean clustering methods:
+  `euclidean_cluster` and `voxel_grid_based_euclidean_cluster`.
+  In the default design of Autoware,
+  `voxel_grid_based_euclidean_cluster` is the method used.
+
+- In the YTU campus environment, there are many small objects such as birds,
+  dogs, cats, balls, and cones. To detect, track,
+  and predict these small objects, we want to be able to form clusters from as few points as possible.
+
+- Firstly, we will change our object filter method from `lanelet_filter` to `position_filter`
+  in the [`tier4_perception_component.launch.xml`](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/launch/components/tier4_perception_component.launch.xml) launch file,
+  so that objects outside the lanelet boundaries can still be detected.
+
+!!! note "[`tier4_perception_component.launch.xml`](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/launch/components/tier4_perception_component.launch.xml) launch file:"
+
+ ```diff
+ -
+ +
+ ```
+
+- After changing the filter method for objects,
+ the output of our perception pipeline looks like the image below:
+
+
+
+- Now, we can detect unknown objects that are outside the lanelet map,
+ but we still need to update the filter range
+ or disable the filter for unknown objects in the [`object_position_filter.param.yaml`](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/perception/object_recognition/detection/object_filter/object_position_filter.param.yaml) file.
+
+!!! note "[`object_position_filter.param.yaml`](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/perception/object_recognition/detection/object_filter/object_position_filter.param.yaml) parameter file:"
+
+ ```diff
+ upper_bound_x: 100.0
+ - lower_bound_x: 0.0
+ + lower_bound_x: -100.0
+ - upper_bound_y: 10.0
+ + upper_bound_y: 100.0
+ - lower_bound_y: -10.0
+ + lower_bound_y: -100.0
+ ```
+
+- Also, you can simply disable the filter for unknown labeled objects.
+
+!!! note "[`object_position_filter.param.yaml`](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/perception/object_recognition/detection/object_filter/object_position_filter.param.yaml) parameter file:"
+
+ ```diff
+ - UNKNOWN : true
+ + UNKNOWN : false
+ ```
+
+- After that,
+  we can update our clustering parameters,
+  since objects are no longer filtered out by the lanelet2 map.
+  As mentioned earlier, we want to detect small objects.
+  Therefore,
+  we will decrease the minimum cluster size to 1 in the `voxel_grid_based_euclidean_cluster.param.yaml` file;
+  a minimal sketch of why this matters follows the parameter change below.
+
+!!! note "[`voxel_grid_based_euclidean_cluster.param.yaml`](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/perception/object_recognition/detection/clustering/voxel_grid_based_euclidean_cluster.param.yaml) parameter file:"
+
+ ```diff
+ - min_cluster_size: 10
+ + min_cluster_size: 1
+ ```
+
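+The following pure-Python sketch (illustrative only; the real implementation is voxel-grid based
+and far more efficient) shows why a `min_cluster_size` of 10 can silently drop small objects
+that return only a few lidar points:
+
+```python
+import numpy as np
+
+def euclidean_cluster(points, tolerance=0.7, min_cluster_size=10):
+    """Naive fixed-radius region growing; clusters smaller than min_cluster_size are dropped."""
+    unvisited = set(range(len(points)))
+    clusters = []
+    while unvisited:
+        frontier = [unvisited.pop()]
+        cluster = list(frontier)
+        while frontier:
+            idx = frontier.pop()
+            neighbors = [j for j in list(unvisited)
+                         if np.linalg.norm(points[j] - points[idx]) <= tolerance]
+            for j in neighbors:
+                unvisited.remove(j)
+            cluster.extend(neighbors)
+            frontier.extend(neighbors)
+        if len(cluster) >= min_cluster_size:
+            clusters.append(cluster)
+    return clusters
+
+# A car-sized object sampled densely (45 points) and a small object returning only 3 points.
+xs, ys = np.meshgrid(np.linspace(8.0, 12.0, 9), np.linspace(-1.0, 1.0, 5))
+car = np.stack([xs.ravel(), ys.ravel(), np.full(xs.size, 0.5)], axis=1)
+ball = np.array([[5.0, 2.0, 0.1], [5.05, 2.0, 0.12], [5.0, 2.05, 0.1]])
+cloud = np.vstack([car, ball])
+
+print(len(euclidean_cluster(cloud, min_cluster_size=10)))  # 1 -> the small object is dropped
+print(len(euclidean_cluster(cloud, min_cluster_size=1)))   # 2 -> the small object becomes a cluster
+```
+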
+- After making these changes, our perception output is shown in the following image:
+
+
+
+If you still want to filter unknown objects after fine-tuning the clustering,
+you can use either the lanelet filter or the position filter for unknown objects.
+Please refer to the [`detected_object_validation` package documentation](https://autowarefoundation.github.io/autoware.universe/main/perception/detected_object_validation/) for further information.
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/planning-tuning/images/avoidance-current-lane.png b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/planning-tuning/images/avoidance-current-lane.png
new file mode 100644
index 00000000000..5f2ae537599
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/planning-tuning/images/avoidance-current-lane.png differ
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/planning-tuning/images/avoidance-lateral-margin.png b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/planning-tuning/images/avoidance-lateral-margin.png
new file mode 100644
index 00000000000..91e5e4a58d3
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/planning-tuning/images/avoidance-lateral-margin.png differ
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/planning-tuning/images/avoidance-opposite-lane.png b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/planning-tuning/images/avoidance-opposite-lane.png
new file mode 100644
index 00000000000..2cb6e71d95b
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/planning-tuning/images/avoidance-opposite-lane.png differ
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/planning-tuning/images/obstacle-avoidance-planner.svg b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/planning-tuning/images/obstacle-avoidance-planner.svg
new file mode 100644
index 00000000000..a9b7c28adae
--- /dev/null
+++ b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/planning-tuning/images/obstacle-avoidance-planner.svg
@@ -0,0 +1,4 @@
+
+
+
+
\ No newline at end of file
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/planning-tuning/images/obstacle-stop-planner.png b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/planning-tuning/images/obstacle-stop-planner.png
new file mode 100644
index 00000000000..4ea933b1aa7
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/planning-tuning/images/obstacle-stop-planner.png differ
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/planning-tuning/images/speed-bump.png b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/planning-tuning/images/speed-bump.png
new file mode 100644
index 00000000000..1c571cefdde
Binary files /dev/null and b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/planning-tuning/images/speed-bump.png differ
diff --git a/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/planning-tuning/index.md b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/planning-tuning/index.md
new file mode 100644
index 00000000000..156d397244a
--- /dev/null
+++ b/docs/how-to-guides/integrating-autoware/tuning-parameters-and-performance/tuning-parameters/planning-tuning/index.md
@@ -0,0 +1,280 @@
+# Tuning planning
+
+## Introduction
+
+In this section, we will update, evaluate, and fine-tune the Autoware planning modules,
+with a specific emphasis on the lane driving modules within the YTU campus environment.
+For lane driving, the Autoware planning stack consists of two main parts:
+**behavior planning and motion planning**. We will focus on fine-tuning these modules to
+improve planning performance in the campus environment.
+
+## Planning parameter tuning
+
+### Behavior planning tuning
+
+#### Behavior velocity planner
+
+The behavior velocity planner adjusts the velocity profile based on traffic rules.
+It loads its modules as plugins. Please refer to the package documentation for more
+information about these modules.
+
+Behavior velocity planner modules are enabled or disabled
+by editing the corresponding plugin entries in the [`default_preset.yaml` file](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/planning/preset/default_preset.yaml) according to our preferences.
+
+- For example, in the YTU campus environment, there are many speed bumps, pedestrians, and specific areas where autonomous driving is not permitted.
+  We will enable the following two modules to handle these conditions:
+
+ - Speed bump module
+ - Out of lane module
+
+To enable these modules,
+we need to set the corresponding `launch_*` arguments
+to `true` in the `default_preset.yaml` parameter file:
+
+!!! note "[`default_preset.yaml`](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/planning/preset/default_preset.yaml) parameter file:"
+
+ ```diff
     - arg:
         name: launch_speed_bump_module
    -    default: "false"
    +    default: "true"
     - arg:
         name: launch_out_of_lane_module
    -    default: "false"
    +    default: "true"
+ ```
+
+#### Speed bump module tuning
+
+- Our vehicle's cruising speed is set to `15 km/h`.
+  Therefore,
+  we will decrease the default speed bump velocity limits in `speed_bump.param.yaml`
+  to lower the minimum and maximum speeds used when passing over speed bumps;
+  a short sketch of how the module maps bump height to speed is shown after the parameter change below.
+
+!!! note "[`speed_bump.param.yaml`](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/planning/scenario_planning/lane_driving/behavior_planning/behavior_velocity_planner/speed_bump.param.yaml) parameter file:"
+
+ ```diff
+ - min_speed: 1.39 # [m/s] = [5 kph]
+ + min_speed: 1.11 # [m/s] = [4 kph]
+ - max_speed: 2.78 # [m/s] = [10 kph]
+ + max_speed: 2.22 # [m/s] = [8 kph]
+ ```
+
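+Conceptually, the module slows the vehicle more for taller bumps. The sketch below assumes a
+simple linear height-to-speed relation between the configured limits (the 0.05 m / 0.30 m height
+limits are illustrative placeholders, not the actual defaults):
+
+```python
+def speed_bump_target_speed(bump_height, min_speed, max_speed,
+                            min_height=0.05, max_height=0.30):
+    """Linear height-to-speed relation: low bumps allow max_speed, tall bumps force min_speed.
+    The height limits used here are illustrative placeholders."""
+    if bump_height <= min_height:
+        return max_speed
+    if bump_height >= max_height:
+        return min_speed
+    ratio = (bump_height - min_height) / (max_height - min_height)
+    return max_speed - ratio * (max_speed - min_speed)
+
+# With the tuned limits (1.11-2.22 m/s), a 0.15 m bump is taken at ~1.78 m/s (~6.4 km/h).
+print(round(speed_bump_target_speed(0.15, min_speed=1.11, max_speed=2.22), 2))
+```
+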
+- Also, we will increase the slow_start_margin parameter to provide
+ a greater margin for slowing down when approaching a speed bump.
+ Please refer to the speed bump module page for more information
+ about these parameters. It is essential to fine-tune these modules
+ based on your specific vehicle and environmental conditions.
+
+!!! note "[`speed_bump.param.yaml`](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/planning/scenario_planning/lane_driving/behavior_planning/behavior_velocity_planner/speed_bump.param.yaml) parameter file:"
+
+ ```diff
    - slow_start_margin: 1.0 # [m]
    + slow_start_margin: 2.0 # [m]
+ ```
+
+- The following image illustrates the virtual wall created by the slow start margin
+ of the speed bump module. If you increase or decrease the `slow_start_margin` parameter,
+  you will observe that the position of the virtual wall changes relative to the speed bump.
+
+
+
+#### Avoidance
+
+The Avoidance module plays a pivotal role in Autoware's behavior planning scene modules,
+offering rule-based avoidance capabilities. It provides the flexibility to define behavior
+based on intuitive parameters like lateral jerk and avoidance distance margin, enabling
+adaptable avoidance strategies tailored to your specific environment. This module operates
+within lane constraints, requiring access to lane structure information to ensure compliance
+with traffic rules. For example, it triggers the turn indicators when the vehicle crosses a lane boundary.
+The key distinction between motion and behavior modules in the planning stack is their consideration
+of traffic rules.
+Please refer to the [`Avoidance Module` page](https://autowarefoundation.github.io/autoware.universe/main/planning/behavior_path_planner/docs/behavior_path_planner_avoidance_design/) for more information about the module's capabilities.
+
+We will modify certain avoidance rules and margin parameters
+to handle the specific conditions of our YTU campus environment
+and the capabilities of our vehicle.
+
+- First, we will disable the use of opposite lanelets while avoiding,
+ as we do not want to utilize them in our YTU Campus environment.
+ The default avoidance behavior in our environment is as depicted in the image below.
+
+
+
+- To disable the use of the opposite lanelet in the avoidance, we will modify the value in the `avoidance.param.yaml` file:
+
+!!! note "[`avoidance.param.yaml`](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/planning/scenario_planning/lane_driving/behavior_planning/behavior_path_planner/avoidance/avoidance.param.yaml) parameter file:"
+
+ ```diff
+ - use_opposite_lane: true
+ + use_opposite_lane: false
+ ```
+
+Now, we expect the avoidance module not to utilize the opposite lane in our area.
+If the vehicle cannot fit past an obstacle in the same lanelet, it will come to a stop,
+as depicted in the image below:
+
+
+
+Since we disabled the use of opposite lanelets for the Avoidance module,
+there may be instances where we cannot avoid objects because the shifted path
+(`safety_buffer_lateral` + `avoid_margin_lateral`) does not fit within the available lanelet.
+Since our vehicle is small, we will reduce the `avoid_margin_lateral` parameter to decrease the distance
+kept between objects and the ego vehicle; a rough feasibility sketch is shown after the parameter change below.
+You can adjust these margins per target object according to your preference.
+For more information on these parameters, please refer to
+the [avoidance module documentation](https://autowarefoundation.github.io/autoware.universe/main/planning/behavior_path_planner/docs/behavior_path_planner_avoidance_design/).
+
+!!! note "[`avoidance.param.yaml`](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/planning/scenario_planning/lane_driving/behavior_planning/behavior_path_planner/avoidance/avoidance.param.yaml) parameter file:"
+
+ ```diff
+ ...
+ - avoid_margin_lateral: 1.0 # [m]
+ + avoid_margin_lateral: 0.5 # [m]
+ ...
+ ```
+
+
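+The following back-of-the-envelope sketch (not the module's actual logic; all numbers are
+illustrative) shows why reducing `avoid_margin_lateral` can make the difference between
+avoiding within the current lane and stopping behind the obstacle:
+
+```python
+def required_lateral_shift(object_overhang, safety_buffer_lateral, avoid_margin_lateral):
+    """Rough approximation of the lateral distance the ego footprint must shift away
+    from the obstacle edge (the real module also considers object envelopes,
+    shift gradients, jerk limits, etc.)."""
+    return object_overhang + safety_buffer_lateral + avoid_margin_lateral
+
+def fits_in_current_lane(lane_width, vehicle_width, shift):
+    """Can the ego stay inside its own lane after shifting laterally by `shift`?"""
+    return shift <= lane_width - vehicle_width
+
+lane_width = 3.0       # [m] illustrative campus lane width
+vehicle_width = 1.5    # [m] illustrative width of our small test vehicle
+object_overhang = 0.4  # [m] how far the obstacle protrudes into our lane
+
+for margin in (1.0, 0.5):  # default vs. tuned avoid_margin_lateral
+    shift = required_lateral_shift(object_overhang, safety_buffer_lateral=0.5,
+                                   avoid_margin_lateral=margin)
+    print(f"margin={margin}: shift={shift:.1f} m, "
+          f"fits in lane={fits_in_current_lane(lane_width, vehicle_width, shift)}")
+```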
+
+- Also, you can choose which objects can be avoided along the path.
+ In the YTU campus environment, as mentioned earlier, there are many
+ pedestrians, and we do not want to perform avoidance maneuvers for them.
+ Therefore, we will disable the avoidance maneuver for pedestrians by
+ modifying the target value in the [`avoidance.param.yaml` file](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/planning/scenario_planning/lane_driving/behavior_planning/behavior_path_planner/avoidance/avoidance.param.yaml).
+ You can disable any target object according to your preference.
+
+!!! note "[`avoidance.param.yaml`](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/planning/scenario_planning/lane_driving/behavior_planning/behavior_path_planner/avoidance/avoidance.param.yaml) parameter file:"
+
+ ```diff
+ ...
+ pedestrian:
+ - is_target: true
+ + is_target: false
+ ...
+ ```
+
+### Motion planning tuning
+
+#### Obstacle avoidance planner
+
+The obstacle avoidance planner generates a kinematically feasible
+and collision-free trajectory based on the input path and drivable area.
+It updates the trajectory's position and orientation but retains the velocity from the input path.
+Please refer to the [Obstacle Avoidance Planner](https://autowarefoundation.github.io/autoware.universe/main/planning/obstacle_avoidance_planner/)
+and [Model Predictive Trajectory (MPT)](https://autowarefoundation.github.io/autoware.universe/main/planning/obstacle_avoidance_planner/docs/mpt/)
+documentation page for more information about this package.
+
+- The YTU Campus environment features numerous U-turns, narrow roads, and roundabouts.
+  Since our vehicle's maximum steering angle is limited,
+  the planner must respect the steering angle limit (defined in `vehicle_info.param.yaml`)
+  to navigate these road types without exceeding the road boundaries.
+  Therefore, we will enable the `steer_limit_constraint` parameter for the obstacle avoidance planner:
+
+!!! note "[`obstacle_avoidance_planner.param.yaml`](https://github.com/autowarefoundation/autoware.universe/blob/main/planning/obstacle_avoidance_planner/config/obstacle_avoidance_planner.param.yaml) parameter file:"
+
+ ```diff
+ - steer_limit_constraint: false
+ + steer_limit_constraint: true
+ ```
+
+- Additionally, we will modify how the planner checks whether the vehicle leaves the drivable area.
+  By default, the obstacle avoidance planner checks only the four corner points of the MPT footprint.
+  However, this may miss violations in some situations, as shown in the following image.
+
+
+
+- To address this issue,
+  we will enable the `use_footprint_polygon_for_outside_drivable_area_check` parameter
+  to treat the footprint as a polygon and check whether it exceeds the lanelet2 boundaries;
+  a small sketch of the difference between the two checks is shown after the parameter change below.
+
+!!! note "[`obstacle_avoidance_planner.param.yaml`](https://github.com/autowarefoundation/autoware.universe/blob/main/planning/obstacle_avoidance_planner/config/obstacle_avoidance_planner.param.yaml) parameter file:"
+
+ ```diff
+ - use_footprint_polygon_for_outside_drivable_area_check: false # If false, only the footprint's corner points are considered.
+ + use_footprint_polygon_for_outside_drivable_area_check: true # If false, only the footprint's corner points are considered.
+ ```
+
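+The difference between the two checks can be seen in the following pure-Python sketch
+(illustrative only; the geometry and the drivable area are made up): when the footprint cuts
+the inner corner of a bend, all four corners can still lie inside the drivable area
+while one of the footprint edges crosses the boundary.
+
+```python
+import numpy as np
+
+def in_drivable_area(p):
+    """Toy L-shaped drivable area: a straight segment plus a 90-degree bend."""
+    x, y = p
+    in_straight = 0.0 <= x <= 10.0 and 0.0 <= y <= 4.0
+    in_bend = 6.0 <= x <= 10.0 and 0.0 <= y <= 14.0
+    return in_straight or in_bend
+
+def footprint_corners(center, yaw, length, width):
+    u = np.array([np.cos(yaw), np.sin(yaw)])   # longitudinal axis
+    v = np.array([-np.sin(yaw), np.cos(yaw)])  # lateral axis
+    c = np.array(center)
+    return [c + 0.5 * length * u + 0.5 * width * v,
+            c + 0.5 * length * u - 0.5 * width * v,
+            c - 0.5 * length * u - 0.5 * width * v,
+            c - 0.5 * length * u + 0.5 * width * v]
+
+def corner_check(corners):
+    return all(in_drivable_area(p) for p in corners)
+
+def polygon_check(corners, samples_per_edge=50):
+    """Approximate the footprint-polygon check by sampling points along each edge."""
+    for a, b in zip(corners, corners[1:] + corners[:1]):
+        for t in np.linspace(0.0, 1.0, samples_per_edge):
+            if not in_drivable_area(a + t * (b - a)):
+                return False
+    return True
+
+# A footprint cutting the inner corner of the bend.
+corners = footprint_corners(center=(6.0, 4.0), yaw=np.pi / 4, length=4.0, width=1.5)
+print(corner_check(corners))   # True  -> the corner-point check misses the violation
+print(polygon_check(corners))  # False -> the polygon check detects it
+```
+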
+#### Obstacle stop planner
+
+Autoware implements two motion stop planners:
+the obstacle stop planner and the obstacle cruise planner.
+We use the obstacle stop planner in the campus environment because it uses the raw input point cloud
+to insert stop points into the trajectory, which is the more conservative and safer option.
+The obstacle cruise planner, on the other hand, uses detected dynamic objects instead of the point cloud,
+which lets it use the object velocities estimated by the perception stack rather than estimating them on the planning side.
+
+To select which stop planner is used,
+set the `motion_stop_planner_type` parameter in the `default_preset.yaml` file:
+
+!!! note "[`default_preset.yaml`](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/planning/preset/default_preset.yaml) parameter file:"
+
+ ```yaml
+ - arg:
+ name: motion_stop_planner_type
+ default: obstacle_stop_planner
+ # option: obstacle_stop_planner
+ # obstacle_cruise_planner
+ # none
+ ```
+
+In our YTU environment and test vehicle conditions,
+the vehicle is narrow, so the detection area that follows the trajectory footprint is also narrow.
+To widen the detection area of the obstacle stop planner,
+we will increase the lateral margin of the detection area;
+a short sketch of this check is shown after the parameter change below.
+For more information on the parameters and the inner workings of the obstacle stop planner,
+please refer to the [documentation page](https://autowarefoundation.github.io/autoware.universe/main/planning/obstacle_stop_planner/).
+
+!!! note "[`obstacle_stop_planner.param.yaml`](https://github.com/autowarefoundation/autoware_launch/blob/main/autoware_launch/config/planning/scenario_planning/lane_driving/motion_planning/obstacle_stop_planner/obstacle_stop_planner.param.yaml) parameter file:"
+
+ ```diff
+ detection_area:
+ - lateral_margin: 0.0 # margin [m]
+ + lateral_margin: 0.6 # margin [m]
+ ```
+
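+A simplified view of this check (illustrative only; values are made up): a point cloud point
+becomes a stop candidate if its lateral offset from the trajectory lies within half the vehicle
+width plus the lateral margin.
+
+```python
+def in_detection_area(lateral_offset, vehicle_width, lateral_margin):
+    """Simplified stop-candidate test: is the point within the footprint
+    expanded laterally by lateral_margin on each side?"""
+    return abs(lateral_offset) <= vehicle_width / 2.0 + lateral_margin
+
+vehicle_width = 1.5      # [m] illustrative narrow test vehicle
+pedestrian_offset = 1.0  # [m] point slightly beside the planned path
+
+print(in_detection_area(pedestrian_offset, vehicle_width, lateral_margin=0.0))  # False -> missed
+print(in_detection_area(pedestrian_offset, vehicle_width, lateral_margin=0.6))  # True  -> stop point inserted
+```
+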
+The following image illustrates the motion behavior of the trajectory,
+considering the lateral_margin and max_longitudinal_margin parameters.
+
+
+
+Autoware has many tunable parameters and methods;
+the sample tuning examples above cover only a small part of what is possible.
+For more information on the modules and their parameters,
+please refer to the [Autoware Universe documentation](https://autowarefoundation.github.io/autoware.universe/main/).