Copy-Paste in Hyprland across Wayland & XWayland

After a certain Hyprland upgrade, I can no longer copy & paste between Wayland and XWayland apps, which is very annoying.
The related issue: https://github.com/hyprwm/Hyprland/issues/6132
Maybe fixed in: https://github.com/hyprwm/Hyprland/pull/6086

In issue 6132, someone provided a workaround. I tailored it to my system.

Create a shell script in the home directory (~), named clipsync.sh:

#!/usr/bin/env sh
# Two-way clipboard synchronization between Wayland and X11.
# Requires: wl-clipboard, xclip, clipnotify.
#
# Usage:
# clipsync.sh watch - run in background.
# clipsync.sh kill - kill all background processes.
# echo -n any | clipsync.sh insert - insert clipboard content from stdin.
#
# Workaround for issue:
# "Clipboard synchronization between wayland and xwayland clients broken"
# https://github.com/hyprwm/Hyprland/issues/6132

# Updates clipboard content of both Wayland and X11 if current clipboard content differs.
# Usage: echo -e "1\n2" | clipsync insert
insert() {
    # Read all the piped input into variable.
    value=$(cat)
    wValue="$(wl-paste)"
    xValue="$(xclip -o -selection clipboard)"

    notify() {
        notify-send -u low -c clipboard "$1" "$value"
    }

    if [ "$value" != "$wValue" ]; then
        notify "Wayland"
        echo -n "$value" | wl-copy
    fi

    if [ "$value" != "$xValue" ]; then
        notify "X11"
        echo -n "$value" | xclip -selection clipboard
    fi
}

watch() {
    # Wayland -> X11
    wl-paste --type text --watch "/home/cyl/clipsync.sh" insert &

    # X11 -> Wayland
    while clipnotify; do
        xclip -o -selection clipboard | ~/clipsync.sh insert
    done &
}

kill() {
    pkill wl-paste
    pkill clipnotify
    pkill xclip
    pkill clipsync
}

"$@"

Add an alias in ~/.zshrc:

alias clipsync="~/clipsync.sh"

Enable it by running:

clipsync watch

Kill all background processes by running:

clipsync kill

To start it automatically, add the following to ~/.config/hyprland/hyprland.conf:

exec-once = clipsync watch

REMEMBER TO REMOVE ALL OF THIS ONCE HYPRLAND HAS FIXED THE ISSUE

Update: solved by PR 6086.

Automate OpenROAD testing

Purpose:

We use OpenROAD to generate timing analysis reports for each design. However, manually modifying parameters, running commands, and extracting the important information is too arduous, mainly for the following reasons.

  • First of all, the report OpenROAD generates for each design is very long; the report file can be hundreds of megabytes. It is almost impossible to open it in a text editor and locate the tns and wns data.
  • Secondly, the configuration files used to run OpenROAD are scattered across many locations. Moreover, the path of the timing configuration file is stored inside the tcl script. Searching for and modifying these by hand is tedious and error-prone.
  • Furthermore, the amount of data we need is very large. For each design, we want to test its performance under 7 different clock periods and 5 different core area sizes. In other words, for each design we would need to manually modify parameters, run commands, and extract the key information 35 times, which is unbearable, not to mention that we have 7 designs.

Due to these drawbacks, I decided to use a shell script to automate the OpenROAD testing process. I have the following requirements for the script.
  • The script can modify the core_area and die_area parameters in the tcl script, given the path to the tcl script and the desired area scale.
  • The script can modify the create_clock's -period parameter in the sdc configuration file referenced by the tcl script, given the path to the tcl script and the desired clock period.
  • The script can set all the parameters and run the OpenROAD analysis one by one, given the choice sets of clock period and area scale.
  • After OpenROAD generates the report file, the script can read the report, extract the tns & wns values for each stop point, and export the data to a csv table.

Main Test Framework

The entry script is named runit_gxy.sh; it receives the name of a design as an argument.
For each design, the corresponding tcl script is fixed, so I map the design name to its tcl script path.

design="$1"
if [ "${design}" = "APU" ]; then
    file_path_orig="../dataset/run/greedy/APU/10n.tcl"
    file_path="../dataset/run/greedy/APU/10n_bak.tcl"
elif [ "${design}" = "ppu" ]; then
    file_path_orig="../dataset/run/greedy/ppu/5n.tcl"
    file_path="../dataset/run/greedy/ppu/5n_bak.tcl"
elif [ "${design}" = "xtea" ]; then
    file_path_orig="../dataset/run/greedy/xtea/5n.tcl"
    file_path="../dataset/run/greedy/xtea/5n_bak.tcl"
elif [ "${design}" = "yhuff" ]; then
    file_path_orig="../dataset/run/greedy/yh/5n.tcl"
    file_path="../dataset/run/greedy/yh/5n_bak.tcl"
elif [ "${design}" = "dynamic_node" ]; then
    file_path_orig="../dataset/run/greedy/dy/5n2.tcl"
    file_path="../dataset/run/greedy/dy/5n2_bak.tcl"
elif [ "${design}" = "ibex" ]; then
    file_path_orig="../dataset/run/ibex_tcl/C0A0.tcl"
    file_path="../dataset/run/ibex_tcl/C0A0_bak.tcl"
elif [ "${design}" = "salsa20" ]; then
    file_path_orig="../dataset/run/greedy/sal/5n.tcl"
    file_path="../dataset/run/greedy/sal/5n_bak.tcl"
else
    echo "no such design"
    exit 1
fi

if [ ! -f "$file_path" ]; then
    echo "File $file_path not found."
    if [ ! -f "$file_path_orig" ]; then
        echo "Origin file ${file_path_orig} not found."
        exit 1
    fi
    cp "$file_path_orig" "$file_path"
fi

The reason I use the _bak postfix is that I want to preserve the original core_area and die_area configuration in the original tcl file, since it is needed in the following steps.

Then I use nested loops to change the configurations one by one. The choice sets can be modified in the word list of each for loop.

for period in 2 4 6 8 10 12 14
do
    # set clk period
    for scale in 0.2 0.4 0.6 0.8 1
    do
        # set area scale
        # run openroad
        timeout 5 ../../OpenROAD-flow-scripts/tools/OpenROAD/build/src/openroad $file_path > $output_path
        # collect data and export to csv
    done
done

The set clk period, set area scale, and export to csv functions will be covered in the following steps.

Set the Clock Period

I create a new shell script called set_clk.sh to automate changing the clock period.
It takes two arguments: the path to the tcl script (string) and the desired clock period (integer).
I first preprocess the period parameter, namely divide it by 10.

time="$2"
period=$(echo "scale=1; $time / 10.0" | bc)
if [[ $period =~ ^\. ]]; then
    period="0$period" # bc omits the leading zero (e.g. ".2"), so add it back
fi

The clock configuration lives in the sdc file, and the path to the sdc file can be extracted from the tcl file, where it appears in the form set sdc_file "PATH_TO_SDC".
I decided to use grep to locate the set sdc_file line and extract the path to the sdc file with a sed substitution.

tcl_path="$1"
sdc_path=$(grep 'set sdc_file' "$tcl_path" | sed 's/.*"\(.*\)".*/\1/')

In the sdc file, the clock configuration has the form create_clock ... -period 0.x.
Again, use a sed substitution to replace the original period with the input period, and everything is done.

sed -i "s/^\(create_clock.*-period \)[0-9]*\.*[0-9]\(.*\)$/\1$period\2/" "$sdc_path"

Set the core_area and die_area

The area of a design is expressed in the form {x1 y1 x2 y2}, where x1 y1 is the coordinate of the bottom-left corner and x2 y2 is the coordinate of the upper-right corner.
Since the area varies a lot between designs, I need to scale the area proportionally. As a result, I need to preserve the original area stored in the tcl script, as mentioned before.

First, I read the original core_area and die_area from the original script.

file_path="$1"
orig_path="${file_path//_bak}" # remove the _bak postfix to get the original tcl path

core_area=$(grep 'set core_area' "$orig_path" | sed -n 's/.*{\([^}]*\)}.*/\1/p')
die_area=$(grep 'set die_area' "$orig_path" | sed -n 's/.*{\([^}]*\)}.*/\1/p')

IFS=' ' read -r core1 core2 core3 core4 <<< "$core_area"

IFS=' ' read -r die1 die2 die3 die4 <<< "$die_area"

Then I calculate the desired core/die area based on the scale and the original area: the core area shrinks toward its center, and the die area keeps a margin of 10 around the core.

core_x_size=$(echo "$core3" - "$core1" | bc)
core_y_size=$(echo "$core4" - "$core2" | bc)

# Shrink the core box toward its center by the given scale ($2).
# Note: bc's default scale is 0, so the division by 2 truncates to an integer offset.
core1=$(echo "$core1 + $core_x_size * (1 - $2) / 2" | bc)
core3=$(echo "$core3 - $core_x_size * (1 - $2) / 2" | bc)

core2=$(echo "$core2 + $core_y_size * (1 - $2) / 2" | bc)
core4=$(echo "$core4 - $core_y_size * (1 - $2) / 2" | bc)

# Keep a margin of 10 between the core area and the die area.
die1=$(echo "$core1" - "10" | bc)
die2=$(echo "$core2" - "10" | bc)
die3=$(echo "$core3" + "10" | bc)
die4=$(echo "$core4" + "10" | bc)

Finally, use sed substitutions to modify the area configuration in the _bak.tcl script.

sed -i "s/die_area {[^}]*}/die_area {$die1 $die2 $die3 $die4}/g" "$file_path"
sed -i "s/core_area {[^}]*}/core_area {$core1 $core2 $core3 $core4}/g" "$file_path"

Collect data and export to .csv

The key information I need from the report generated by OpenROAD is tns & wns.
By observation, the block containing the data has the following pattern.

***
result of: stoppoint*
tns: [number1]
wns: [number2]
***

Therefore, I use grep and awk to get the last word of each line in a data block; in this case, that is stoppoint*, [number1], and [number2].

input_file="$1"
tnswns=$(grep -A 2 "result of" "$input_file" | awk '/result of/{p=3} p&&p--' | awk '{print $NF}')

Each report contains a total of 6 valid stop points, so I use a loop to get the desired data.

csv_file="${design}.csv"
for nr in 2 5 8 11 14 17
do
    stoppoint=$((stoppoint+1))
    tns=$(echo "$tnswns" | awk -v nr="$nr" 'NR==nr')
    nr_wns=$(echo "$nr + 1" | bc)
    wns=$(echo "$tnswns" | awk -v nr="$nr_wns" 'NR==nr')
    # export to csv table
done

After getting the desired tns and wns, I want to export the data to a csv table.
I decided that the table should have the following columns: period, scale, stop point number, tns, and wns, with one such table for each design.
This can be achieved with the following snippet.

design="$2"
period="$3"
scale="$4"
echo "${period},${scale},stoppoint${stoppoint},${tns},${wns}" >> $csv_file

Replace the comments in the loop with these snippets and everything is done.
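
For concreteness, here is a sketch of how the pieces could fit together inside the nested loop. The helper names set_area.sh and collect_csv.sh, as well as the $output_path value, are hypothetical placeholders; only set_clk.sh is the script described above, and the real scripts may be organized differently.

for period in 2 4 6 8 10 12 14
do
    for scale in 0.2 0.4 0.6 0.8 1
    do
        output_path="${design}_p${period}_s${scale}.log"    # hypothetical report path
        ./set_clk.sh "$file_path" "$period"                 # set clk period
        ./set_area.sh "$file_path" "$scale"                 # set area scale (hypothetical name)
        # run openroad
        timeout 5 ../../OpenROAD-flow-scripts/tools/OpenROAD/build/src/openroad "$file_path" > "$output_path"
        # collect data and export to csv (hypothetical name; argument order matches the snippet above)
        ./collect_csv.sh "$output_path" "$design" "$period" "$scale"
    done
done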

Result

I wrap runit_gxy.sh in a single script named runit_cyl.sh to run the analysis of all designs at once.

./runit_gxy.sh APU
./runit_gxy.sh ppu
...

Then redirect the output to runit_cyl.log.
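
For example, one possible invocation that also captures any error messages:

./runit_cyl.sh > runit_cyl.log 2>&1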

Here are the contents of APU.csv and runit_cyl.log:

Period,Scale,Stoppoint,tns,wns
2,1,stoppoint1,-17.2309532166,-0.4609909952
2,1,stoppoint2,-13.7819681168,-0.4073848128
2,1,stoppoint3,-13.7819681168,-0.4073848128
2,1,stoppoint4,-14.1436796188,-0.3814869523
2,1,stoppoint5,-14.1436796188,-0.3814869523
2,1,stoppoint6,-14.1436796188,-0.3814869523
2,0.95,stoppoint1,-17.2309532166,-0.4609909952
2,0.95,stoppoint2,,
2,0.95,stoppoint3,,
2,0.95,stoppoint4,,
2,0.95,stoppoint5,,
2,0.95,stoppoint6,,
2,0.9,stoppoint1,-17.2309532166,-0.4609909952
2,0.9,stoppoint2,,
2,0.9,stoppoint3,,
2,0.9,stoppoint4,,
2,0.9,stoppoint5,,
2,0.9,stoppoint6,,
4,1,stoppoint1,-8.3930120468,-0.2609909773
4,1,stoppoint2,-4.7969579697,-0.2073848099
4,1,stoppoint3,-4.7969579697,-0.2073848099
4,1,stoppoint4,-4.5129084587,-0.1783644110
4,1,stoppoint5,-4.5129084587,-0.1783644110
4,1,stoppoint6,-4.5129084587,-0.1783644110
4,0.95,stoppoint1,-8.3930120468,-0.2609909773
4,0.95,stoppoint2,,
4,0.95,stoppoint3,,
4,0.95,stoppoint4,,
4,0.95,stoppoint5,,
4,0.95,stoppoint6,,
4,0.9,stoppoint1,-8.3930120468,-0.2609909773
4,0.9,stoppoint2,,
4,0.9,stoppoint3,,
4,0.9,stoppoint4,,
4,0.9,stoppoint5,,
4,0.9,stoppoint6,,
6,1,stoppoint1,-1.8987674713,-0.0609909929
6,1,stoppoint2,-0.0572821237,-0.0073847598
6,1,stoppoint3,-0.0572821237,-0.0073847598
6,1,stoppoint4,0.0000000000,0.0000000000
6,1,stoppoint5,0.0000000000,0.0000000000
6,1,stoppoint6,0.0000000000,0.0000000000
...

use design: APU
use tcl file path ../dataset/run/greedy/APU/10n_bak.tcl
get sdc path: ../dataset/run/greedy/gcd/1/gcd_nangate45.sdc
Change period to 0.2
create_clock [get_ports clk] -name core_clock -period 0.2
core area difference 60.18 59.8
set die_area {.07 1.2 80.25 81}
set core_area {10.07 11.2 70.25 71}
collecting data
core area difference 60.18 59.8
set die_area {1.07 2.2 79.25 80}
set core_area {11.07 12.2 69.25 70}
collecting data
core area difference 60.18 59.8
set die_area {3.07 3.2 77.25 79}
set core_area {13.07 13.2 67.25 69}
collecting data
get sdc path: ../dataset/run/greedy/gcd/1/gcd_nangate45.sdc
Change period to 0.4
create_clock [get_ports clk] -name core_clock -period 0.4
core area difference 60.18 59.8
set die_area {.07 1.2 80.25 81}
set core_area {10.07 11.2 70.25 71}
collecting data
core area difference 60.18 59.8
set die_area {1.07 2.2 79.25 80}
set core_area {11.07 12.2 69.25 70}
collecting data
core area difference 60.18 59.8
set die_area {3.07 3.2 77.25 79}
set core_area {13.07 13.2 67.25 69}
collecting data
get sdc path: ../dataset/run/greedy/gcd/1/gcd_nangate45.sdc
Change period to 0.6
create_clock [get_ports clk] -name core_clock -period 0.6
core area difference 60.18 59.8
set die_area {.07 1.2 80.25 81}
set core_area {10.07 11.2 70.25 71}
collecting data
...

Implement Built-in Commands in C

What are the built-in commands?

A shell built-in command is contained in the shell itself (implemented by the shell author) and runs in the same shell process without creating a new one, e.g. cd, pwd, exit, alias.

By contrast, a regular (Linux) command is run from a binary file (found in $PATH or at a specified path) by forking the existing shell process and executing the binary in the child process, e.g. ls, cat, rm.
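
As an illustration of this contrast (a minimal sketch, not the code of this project), an external command is typically launched roughly like this:

#include <stdio.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

// Run an external command: args must be a NULL-terminated argument vector.
void exec_external(char **args) {
    pid_t pid = fork();            // duplicate the shell process
    if (pid == 0) {
        execvp(args[0], args);     // child: replace itself with the binary found in $PATH
        perror("execvp");          // only reached if exec fails
        _exit(127);
    } else if (pid > 0) {
        waitpid(pid, NULL, 0);     // parent (the shell) waits for the child to finish
    } else {
        perror("fork");
    }
}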

How to implement

1. Judge whether it is a built-in command

Assume we have already obtained the parsed arguments

char * parsedArgs[MAXPARSE].

For example, an input char * input = "cd .." can be parsed into char * parsedArgs[MAXPARSE] = {"cd", ".."}.

Then we just need to check whether parsedArgs[0] is contained in our built-in command set {"exit", "cd", "pwd", ...} (using strcmp), as sketched below.
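
A minimal sketch of this check, assuming a hypothetical helper name run_builtin and the exec_exit / exec_pwd / exec_cd functions implemented below:

#include <string.h>

void exec_exit(void);
void exec_pwd(void);
void exec_cd(char *path);

// Returns 1 if parsedArgs[0] was a built-in and has been executed in the
// shell process itself; returns 0 so the caller can fork and exec instead.
int run_builtin(char **parsedArgs) {
    if (strcmp(parsedArgs[0], "exit") == 0) {
        exec_exit();
    } else if (strcmp(parsedArgs[0], "pwd") == 0) {
        exec_pwd();
    } else if (strcmp(parsedArgs[0], "cd") == 0) {
        exec_cd(parsedArgs[1]);
    } else {
        return 0;
    }
    return 1;
}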

2. Implementation

In this project (as of milestone 2), we need to implement exit, pwd, and cd.

exit

In Bash, exit is a built-in command used to terminate the current shell session (same in our mumsh).

Each running shell is a process, so we can call exit() from stdlib.h to terminate the shell process, which exits the shell.

#include <stdio.h>
#include <stdlib.h>

void exec_exit() {
    printf("exit\n");
    exit(0); // 0 means normal termination, nonzero means abnormal
}

pwd

Running a crude (or minimal) shell and have no idea where you are? Just type pwd and it will print the current working directory.

In C, we can implement similar functionality using getcwd() from unistd.h.

On success, getcwd() returns a pointer to the result; otherwise (e.g. not enough space in the buffer) it returns NULL.

#include <stdio.h>
#include <unistd.h>

void exec_pwd() {
    char buf[1024]; // buffer to hold the path; assumes it fits
    printf("%s\n", getcwd(buf, sizeof(buf)));
}

However, the working directory path is not guaranteed to be shorter than 1024 bytes, so we may need to allocate the buffer dynamically to fit its length.

// Needs <errno.h>, <stdio.h>, <stdlib.h>, <unistd.h>.
int size = 100;
char *buffer = malloc(size);

// Grow the buffer until the path fits (malloc/realloc error checks omitted for brevity).
while (getcwd(buffer, size) == NULL) {
    if (errno == ERANGE) { // buffer too small
        size *= 2;
        buffer = realloc(buffer, size);
    } else { // other error
        /* ... handle the error ... */
        free(buffer);
        return 1;
    }
}

printf("%s\n", buffer);

free(buffer);

cd

cd is used to change the working directory.

In C, we can implement similar functionality using chdir() from unistd.h.

On success, chdir() returns 0; otherwise (e.g. no such directory) it returns -1.

#include <unistd.h>

void exec_cd(char *path) {
    chdir(path); // returns -1 on failure, e.g. no such directory
}

Very straightforward, but if we want to implement more advanced features of the cd command, such as changing to the home directory or to the previous directory, we need to handle those cases ourselves.

For example, you could check whether the argument is ~ or -, and then call chdir() with the appropriate path, as sketched below.
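
A minimal sketch of such an extended exec_cd, assuming $HOME gives the home directory and remembering the previous directory in a static buffer (real shells typically use the OLDPWD variable instead):

#include <limits.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

void exec_cd(char *path) {
    static char prev[PATH_MAX] = ""; // previous working directory, empty until the first cd
    char cur[PATH_MAX];
    char target[PATH_MAX];
    const char *home = getenv("HOME");

    if (path == NULL || strcmp(path, "~") == 0) { // "cd" or "cd ~" goes home
        if (home == NULL) {
            fprintf(stderr, "cd: HOME not set\n");
            return;
        }
        snprintf(target, sizeof(target), "%s", home);
    } else if (strcmp(path, "-") == 0) { // "cd -" goes to the previous directory
        if (prev[0] == '\0') {
            fprintf(stderr, "cd: no previous directory\n");
            return;
        }
        snprintf(target, sizeof(target), "%s", prev);
        printf("%s\n", target); // bash also prints the directory for "cd -"
    } else {
        snprintf(target, sizeof(target), "%s", path);
    }

    if (getcwd(cur, sizeof(cur)) == NULL) {
        cur[0] = '\0';
    }
    if (chdir(target) == -1) {
        perror("cd");
        return;
    }
    snprintf(prev, sizeof(prev), "%s", cur); // remember where we came from
}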