Bash Auto SSH and execute script

I have roughly 12 computers that each have the same script on them. This script merely pings all the other machines and prints out whether each machine is "reachable" or "unreachable". However, it is inefficient to log in to each machine manually using ssh to execute this script. Suppose I'm logged into node 1. Is there any way for me to log in to nodes 2-12 automatically using SSH, execute the ping script, pipe the results to a file, log out, and proceed to the next machine? Some kind of bash script?

Bash Shell Script: Nested Select Statements

I have a script with a select statement that branches to multiple sub-select statements; however, once there, I cannot figure out how to get it to go back to the main script. Also, if possible, I would like it to re-list the options. #!/bin/bash PS3='Option = ' MAINOPTIONS="Apache Postfix Dovecot All Quit" APACHEOPTIONS="Restart Start Stop Status" POSTFIXOPTIONS="Restart Start Stop Status" DOVECOTOPTIONS="Restart Start Stop Status"
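One way back to the main menu, sketched with a trimmed, hypothetical option list: run each sub-select in a function, let `break` end it, and wrap the main select in a `while` loop so the options are re-listed each time around.

```shell
cd "$(mktemp -d)"
cat > menu.sh <<'EOF'
apache_menu() {
    select opt in Restart Back; do
        case $opt in
            Restart) echo "restarting apache"; break ;;
            Back)    break ;;   # ends the sub-select, returns to caller
        esac
    done
}
while true; do              # re-entering the loop re-lists the main options
    select main in Apache Quit; do
        case $main in
            Apache) apache_menu; break ;;
            Quit)   exit 0 ;;
        esac
    done
done
EOF
# demo: choose Apache (1), then Restart (1), then Quit (2)
printf '1\n1\n2\n' | bash menu.sh
```

The menu prompts themselves go to stderr, so stdout carries only the action's output.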

Bash extract unique value from a log4j log file

I'm having trouble extracting only a matching string, OPER^, from a log4j file. I can get this value from two different sources inside my log file: 2012-01-26 03:06:45,428 INFO [NP_OSS] OSSBSSGWIMPL6000|**OPR20120126120537008893**|GenServiceDeactivationResponse :: processRequestGenServiceDeactivationResponse() :: or: 2012-01-26 03:06:45,411 INFO [NP_OSS] MESSAGE_DATA = <?xml version="1.0" encoding="UTF-8" standalone="yes"?><ns2:ServiceDeactivationResponse xmlns:ns2="urn:ngn:foo">

Bash Using Functions inside here document

I have a script with a few functions inside it which the main body uses. Now I want to run this script on 3 remote Unix machines. What is the neatest way to do this? Most importantly, I don't want to write a second script for the remote connection; everything should be inside this one script. I've tried a heredoc with ssh, which is not working because of the big functions! Code - #!/bin/bash # Year Month Day Related functions # FUNCTIONS # Find no. of days in a year yeardays() { #
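One workaround that keeps everything in a single file is to serialize the function definitions with `declare -f` and send them along with the call. A runnable sketch using a stand-in `yeardays` and a local `bash -c` in place of ssh; the hypothetical remote form would be `ssh host "$(declare -f yeardays); yeardays 2012"` (untested here):

```shell
yeardays() {    # stand-in for the script's real function
    if (( $1 % 4 == 0 && ( $1 % 100 != 0 || $1 % 400 == 0 ) )); then
        echo 366
    else
        echo 365
    fi
}
# ship the definition plus a call into a fresh shell, as ssh would
days=$(bash -c "$(declare -f yeardays); yeardays 2012")
echo "$days"
```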

Bash Execute command after the completion of Shell Script

I am trying to execute a command after a shell script completes. This is what I have tried so far: #! /bin/bash exec >> /tmp/foo.log #$REPO and $AUTHOR are environment variables echo "test" >> /tmp/foo.log echo $REPO >> /tmp/foo.log echo $AUTHOR >> /tmp/foo.log exit 0 cd /var/www/html/websvn php remove_commits.php $REPO $AUTHOR The above script is not working for some reason. How can I fix this? I badly need your help.
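For reference, nothing placed after `exit 0` ever runs; a trap on EXIT does. A minimal demo, with the real `cd`/`php` commands replaced by a hypothetical `echo`:

```shell
cd "$(mktemp -d)"
cat > demo.sh <<'EOF'
cleanup() { echo "runs after the script body"; }   # stand-in for the php call
trap cleanup EXIT
echo "script body"
exit 0
echo "never printed"   # unreachable, like the original cd/php lines
EOF
bash demo.sh
```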

Bash WGET doesn't work with plunder.com

A few days ago I purchased a VPS with CentOS 6, and now I need to download some files from plunder.com using the wget command via SSH, but I always get this error: -bash: !: event not found Please help me!
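`event not found` comes from bash's history expansion: an unquoted `!` in the URL is being interpreted as a history reference by the interactive shell. Single-quoting the URL, or disabling the feature, avoids it (hypothetical URL; wget is only echoed here, not run):

```shell
set +H                                   # disable ! history expansion
url='http://plunder.com/some!file.zip'   # hypothetical; single quotes also protect !
cmd="wget $url"                          # shown via echo instead of executed
echo "$cmd"
```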

Bash understanding parameter expansion with positional parameter

The x11-common package installs an /etc/X11/Xsession.d/20x11-common_process-args script which is sourced by /etc/X11/Xsession. This 20x11-common_process-args script contains the following if-statement: has_option() { if [ "${OPTIONS#* $1}" != "$OPTIONS" ]; then return 0 else return 1 fi } The OPTIONS variable is a list of configuration options from a file, separated by line feeds (0x0a in ASCII). How should this if-statement be understood? Literally, the parameter expansion part should modify the OPTIONS
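`${OPTIONS#* $1}` strips the shortest prefix matching `* $1`; when the option is absent nothing can be stripped, the expansion equals `$OPTIONS`, and the inequality test therefore means "found". A demo with a space-separated list for simplicity:

```shell
OPTIONS=" foo bar baz"          # simplified: space-separated instead of newline
has_option() {
    # prefix removal changed the string <=> " $1" occurs in $OPTIONS
    [ "${OPTIONS#* $1}" != "$OPTIONS" ]
}
has_option bar && echo "bar present"
has_option qux || echo "qux absent"
```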

Bash Usage function not working as expected

I am writing a quick function to display some help information; however, it always runs regardless of whether an argument is given or not: help(){ if [ $# -eq 0 ] ; then echo '' echo '########################################################' echo '' echo 'Argument to run run name must be given: ./report.sh Name' echo '' echo 'Name can be:' echo '' ALLNAMES=$(awk -F'|' '{print $1}' $CONFIGFILE) echo "$ALLNAMES" echo '' echo '########
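A likely cause: inside a function, `$#` counts the function's own arguments, not the script's, so the script has to forward them. A trimmed sketch with hypothetical messages:

```shell
usage() {
    if [ $# -eq 0 ]; then
        echo "Argument must be given: ./report.sh Name"
    else
        echo "running report for $1"
    fi
}
usage "$@"        # forward the script's arguments into the function
usage Alice       # simulated call with an argument supplied
```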

Bash Extract date from log file

I have a log line like this: Tue Dec 2 10:03:46 2014 1 10.0.0.1 0 /home/test4/TEST_LOGIN_201312021003.201412021003.23872.sqlLdr b _ i r test4 ftp 0 * c And I can print the date value of this line like this: echo $log | awk '{print $9}' | grep -oP '(?<!\d)201\d{9}' | head -n 1 I have another log line like this; how can I print its date value? Tue Dec 9 10:48:13 2014 1 10.0.0.1 80 /home/DATA1/2014/12/11/16/20/blablabla_data-2014_12_11_16_20.txt b _ i r spy ftp 0 * c I tried my awk/grep solution
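For the second format, a plain ERE for `YYYY_MM_DD_HH_MM` pulls the stamp out of the 9th field; a sketch with the sample line hard-coded:

```shell
log='Tue Dec 9 10:48:13 2014 1 10.0.0.1 80 /home/DATA1/2014/12/11/16/20/blablabla_data-2014_12_11_16_20.txt b _ i r spy ftp 0 * c'
stamp=$(echo "$log" | awk '{print $9}' |
        grep -oE '[0-9]{4}(_[0-9]{2}){4}' | head -n 1)
echo "$stamp"
```

The slash-separated date components in the directory part cannot match the underscore pattern, so only the file-name stamp is returned.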

Bash How to delete backups like Apple's Time Machine does?

Time Machine saves: hourly backups for the past 24 hours, daily backups for the past month, and weekly backups for everything older than a month, until the volume runs out of space. At that point, Time Machine deletes the oldest weekly backup. I'm at the point where I already have the bash script (rsync) which makes backups every hour. The backups are folders named like "2015-01-01 08", where "08" is the hour. At some point folders older than 24h need to be deleted. So I'm looking for this magic
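For the 24-hour tier, `find -mtime +0` (last modified more than 24 hours ago) can prune the hourly folders. Demonstrated in a scratch directory, with GNU `touch -d` faking an old backup:

```shell
cd "$(mktemp -d)"
mkdir "2015-01-01 08" "2015-01-02 09"
touch -d "2 days ago" "2015-01-01 08"    # pretend this backup is old (GNU touch)
# remove top-level backup folders last modified more than 24h ago
find . -mindepth 1 -maxdepth 1 -type d -mtime +0 -exec rm -r {} +
ls
```

The daily/weekly tiers would need extra filtering on the folder names, but the same `find` skeleton applies.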

bash awk print numbers with two digits

I want to correct wrong metadata or add missing metadata for the 75 CDs I have ripped from disc. I got the track info from AllMusic and stripped it down to almost usable "CSV" data. Number";"1";"Piece";"Nocturne for piano No. 2 in E flat major, Op. 9/2, CT. 109";"Componist";"Frédéric Chopin MainPiece";"";"Piece";"Symphony No. 9 in E minor ("From the New World"), B. 178 (Op. 95) (first published as No. 5) Number";"2";"Piece";"Largo";"Componist";"Antonin Dvorák Number";"3";"Piece";"La plus que lente
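For the two-digit track numbers themselves, a `%02d` printf format pads with a leading zero, in awk or in the shell builtin alike:

```shell
n=$(echo '1' | awk '{printf "%02d", $1}')   # awk-side padding
echo "$n"
printf '%02d\n' 7                           # the shell builtin does the same
```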

Bash wget: delete incomplete files

I'm currently using a bash script to download several images using wget. Unfortunately the server I am downloading from is less than reliable, so sometimes when I'm downloading a file the server disconnects and the script moves on to the next file, leaving the previous one incomplete. In order to remedy this I've tried to add a second line where the script fetches all incomplete files using: wget -c myurl.com/image{1..3}.png This seems to work, as wget goes back and compl

Sending bash commands to an open terminal buffer in emacs

I've been trying to improve my emacs life lately, and one thing I've done is make use of projectile and perspective to organize my buffers sensibly. As part of this I wrote an elisp function to open up (or return to) a named ansi-term buffer that is project-specific. This allows me to quickly drop into a bash terminal for the project I'm currently looking at. What I have been having trouble finding out, after plumbing the interwebs, is whether or not it is possible to send bash commands to an open

Bash move files to new folders with the same name as the file?

How can I move a group of files that share the first 9 characters of their names into newly created folders with those same 9-character names? For example, I have a folder containing a number of files with various names; each group of files begins with the same 9 characters, e.g. first group [HD9523587_352, HD9523587_258, HD9523587_785, HD9523587_473], second group [Hip046329_258, Hip046329_364, Hip046329_681, Hip046329_235], and so on. I need to make new folders with the same 9 characters
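A sketch using bash substring expansion, run in a scratch directory with a few of the sample names:

```shell
cd "$(mktemp -d)"
touch HD9523587_352 HD9523587_258 Hip046329_258 Hip046329_364   # sample files
for f in *_*; do
    dir=${f:0:9}          # first 9 characters of the file name
    mkdir -p "$dir"
    mv "$f" "$dir"/
done
ls
```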

ubuntu 14.04 bash script permission error reading file into variable with $()

I have a bash script placed in a directory under my logged-in user's home directory which reads a file (also placed in the same user's home directory, deeper in the path relative to the bash script) to hold some variables for further processing in the script (simplified here for genericness): #!/bin/bash sEnv=$(./config/myfile.conf) echo $sEnv When running this file from bash with . run-me.sh the script fails with this error: -bash: ./config/myfile.conf: Permission denied permissions on the file
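The `Permission denied` is consistent with `$(./config/myfile.conf)` trying to execute the file rather than read it; `$(< file)` captures the contents instead. A self-contained sketch with hypothetical contents:

```shell
cd "$(mktemp -d)"
mkdir config
echo "production" > config/myfile.conf     # hypothetical config contents
# $(./config/myfile.conf) would EXECUTE the file; read it instead:
sEnv=$(< config/myfile.conf)
echo "$sEnv"
```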

bash: in a case statement, go to other case

I'm trying to write a script where I manipulate variables using options, but there are shortcuts if no option is specified. So, for example, I have add, show, set, and delete, but if I don't specify any of those, then the input is interpreted to determine which operation to perform: script key is a shortcut for script show key script key=value is a shortcut for script set key=value I've written a case statement for the options, but in the catchall, is it possible to say "go to case X"?: case
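There is no goto between cases, but moving the case into a function lets the catchall re-dispatch with canonical arguments; a minimal sketch with hypothetical operations:

```shell
dispatch() {
    case $1 in
        show) echo "showing $2" ;;
        set)  echo "setting ${2%%=*} to ${2#*=}" ;;
        *)    # catchall: translate the shortcut, then "go to" the right case
              if [[ $1 == *=* ]]; then dispatch set "$1"
              else dispatch show "$1"; fi ;;
    esac
}
dispatch mykey        # shortcut for: dispatch show mykey
dispatch mykey=42     # shortcut for: dispatch set mykey=42
```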

Bash IF statement in script and in a single command line

My script below checks for the instance of an opened window in the X server and prints some info in the terminal depending on the state. #!/bin/bash if [[ -z $(xwininfo -tree -root | grep whatsapp | grep chromium) ]] then echo "IT DOES NOT EXIST"; else echo "IT EXIST"; fi When I try to rewrite this into a one-line terminal command I do it like this: if -z $(xwininfo -tree -root | grep whatsapp | grep chromium); then echo "IT DOES NOT EXIST"; else echo "IT EXIST"; fi this returns
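`-z` is an operator of `[[ ]]`, not a command, so the brackets must survive the one-liner. A sketch with a deterministic stand-in pipeline in place of `xwininfo`:

```shell
win=$(printf 'no matching windows here\n' | grep whatsapp)   # stand-in, empty
if [[ -z $win ]]; then msg="IT DOES NOT EXIST"; else msg="IT EXIST"; fi
echo "$msg"
```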

Bash Calculate minimum and maximum value in the same rows using awk

I have a data set that looks like: A 10 A 12 A 13 B 10 B 25 B 66 B 80 C 2 C 3 Using awk I am able to calculate average values per key (using an array). I would like to add minimum and maximum values to my script. Any ideas? My script: awk -v OFS="\t" '{v[$1]+=$2; n[$1]++}END {for (l in n) {print l "\t" v[l] / n[l]}}' > out.txt Thank you for any suggestion.
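Min and max can ride along in two more arrays, seeded on first sight of a key; a sketch with the sample data inlined (the trailing `sort` only stabilizes awk's unordered `for (l in n)`):

```shell
stats=$(printf 'A 10\nA 12\nA 13\nB 10\nB 25\nB 66\nB 80\nC 2\nC 3\n' |
awk -v OFS="\t" '
    { v[$1] += $2; n[$1]++
      if (!($1 in min) || $2 < min[$1]) min[$1] = $2
      if (!($1 in max) || $2 > max[$1]) max[$1] = $2 }
    END { for (l in n) print l, v[l] / n[l], min[l], max[l] }' | sort)
echo "$stats"
```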

Bash How to create an alias/map to a list of commands with arguments passed in

I'm not sure what's the best approach to create this, but here's the flow of what I would like to do in this shell script. I basically have a list of commands/cli (passing in db connection string to postgres) and I want to create a shortcut by passing in an alias name along with another parameter (name) so that I can execute the command. I'm trying to use an associative array, but not sure if this is the way to do this. This is what I have so far. #!/bin/bash set -e set -x declare -A clust

Bash How can I run a Unix script every hour, but after 4AM?

I would like to let a script run every hour, but only if it's after 4AM. My current attempt: if [ -e filename ] || [ date +%k%M < 400 ] then don't do anything else do something fi I am assuming that at 4AM date +%k%M would show 400 - or is it 0400? Is there a better way to check whether 4AM has passed? Cheers Edit: I'm not able to use cron jobs. The script will run 24/7 and should check if it is after or before 5AM.
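`date +%H` is zero-padded ("04"), so force base 10 before comparing; a small helper keeps the logic testable on its own:

```shell
# succeed iff the HH string (as printed by `date +%H`) is 04 or later;
# 10# prevents "08"/"09" from being parsed as invalid octal numbers
past_four() { [ $((10#$1)) -ge 4 ]; }

if past_four "$(date +%H)"; then
    echo "after 4AM: do something"
else
    echo "before 4AM: do nothing"
fi
```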

Bash Why can't I pipe awk outputs into a variable via >

I am trying to pipe awk output into a variable like this: $ awk -F : '/frost/{print $3}' /etc/group > $mygid $ echo $mygid $ But when I want to see the mygid variable it just hangs. I had to do it like this: $ "$(awk -F : '/frost/{print $3}' /etc/group)" to get it to work. I don't understand why. Thanks
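`>` always redirects to a file (here, a file named by the empty `$mygid`), which is why nothing lands in the variable; command substitution with an assignment is what stores output. A demo on a hard-coded, hypothetical group line rather than the real /etc/group:

```shell
group_db='frost:x:1001:alice'    # hypothetical /etc/group line
mygid=$(printf '%s\n' "$group_db" | awk -F : '/frost/{print $3}')
echo "$mygid"
```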

Bash $? Variable Assignment

I just started learning bash and I was reading about the $? variable. From what I understood, $? is assigned the exit status of the last command executed. For example $ false; echo $? will yield 1, and $ false; :; echo $? will yield 0. Things get a bit more complicated when I combine these with if/for blocks. The man page reads: for NAME [in WORDS ... ] ; do COMMANDS; done Execute commands for each member in a list. The for loop executes a sequence of commands for each me
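The two basic cases can be checked directly by saving `$?` immediately after each command, before anything else overwrites it:

```shell
false
status1=$?       # exit status of `false`
false; :
status2=$?       # `:` succeeds, overwriting the 1 from `false`
echo "$status1 $status2"
```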

Bash Switch user and store command output in variable

I'm trying to get command output into a variable when switching user in my script: checkOutput="$(echo command1 | su - user1)" But using echo on the variable does not return the desired result: echo $checkOutput Output: Oracle Corporation SunOS ... January 2005 logout Desired output (to be stored in variable): Java version ... Apache Qpid is RUNNING with PID=5227 When running command echo command1 | su - user1 in terminal the output is: Oracle Corporation SunOS ... January 2005 Java

Bash Given two -exec arguments to find, is there any elegant way to pass state or variables?

I was hoping for: find . [whatever] -exec VAR="$(another command)" \; -exec [use $VAR here] \; but this predictably falls apart because the two -execs are different processes. Is there any elegant way to pass state from one computation to another that is not bash -c "mess of escaped characters here" "{}"? I don't mind if it's a bash or GNU find extension. BTW, I can embed $(secondary cmd) directly into the command; however, variable expansion not being lazy is screwing me over. If one tries: find .

bash script runs but does not display values

My script runs without any errors but does not display the variable values. The output on the screen is two blank lines. #! /bin/bash set v1=25 set v2 [format "%c" $v1] echo "$v1" echo "$v2"

Bash Prevent pwgen -y generating backticks or quotes? (or otherwise sanitize output)

I'm using pwgen in a bash script. For security, we have to use the -y flag to include at least one special character. However, this frequently returns passwords with one or more ` or " characters, which break the surrounding script. Can I prevent these characters from being generated? If not, what's the cleanest way to remove, replace or otherwise sanitize pwgen's output to exclude these characters? My current pwgen is: intPW=$(pwgen -c -n -y -B -1 15) which means intPW's value could be something like:

Bash output of sed gives strange result when using capture groups

I'm running the following command in bash: echo -e 'UNUSED\nURL: ^/tags/0.0.0/abcd' | sed -rn 's#^URL: \^/tags/([^/]+)/#\1#p' I think this should output only the matching lines and the content of the capture group, so I'm expecting 0.0.0 as the result. But I'm getting 0.0.0abcd Why does the result contain parts from the left and the right side of the /? What am I doing wrong?
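sed replaces only the portion the regex actually matched; the trailing `abcd` was never part of the match, so it survives the substitution. Extending the pattern with `.*` consumes the rest of the line (GNU sed `-r`):

```shell
result=$(printf 'UNUSED\nURL: ^/tags/0.0.0/abcd\n' |
         sed -rn 's#^URL: \^/tags/([^/]+)/.*#\1#p')
echo "$result"
```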

Translating OS X Bash Script for Windows

I use Hedge to transfer Magic Lantern video files shot on my Canon 5D Mark III. On OS X, I'm able to use Automator to set up a bash script that executes mlv_dump to transfer the files from MLV into cDNG sequences. The script I currently use is: cd "$(dirname "$1")" for f in "$@"; do if [[ -f $f ]]; then filename=${f##*/}; folder=${filename%.*} mkdir "${folder}"; ~/mlv_dump --dng $f -o ${folder}/${folder}_; fi done Can this easily translate into a Wi

Bash How to assign variables from the file in a loop

I'm trying to write a bash script that will read pairs of variables from a file and then use them in a loop. I'm trying to use this: while read p; do $p; echo "$a and $b"; done < text.txt with text.txt containing the following: a="teststring"; b="anothertest" a="teststring1"; b="anothertest1" a="teststring2"; b="anothertest2" The output looks like this: bash: a="teststring";: command not found and bash: a="teststring1";: command not found and I have found a similar question co
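`$p` runs the whole line as a single command name; `eval` makes the shell re-parse it as code so the assignments take effect (only safe for trusted files). A runnable sketch:

```shell
cd "$(mktemp -d)"
cat > text.txt <<'EOF'
a="teststring"; b="anothertest"
a="teststring1"; b="anothertest1"
EOF
pairs=$(while read -r p; do
    eval "$p"                 # re-parse the line: the assignments now happen
    echo "$a and $b"
done < text.txt)
echo "$pairs"
```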

Bash awk numbered columns and ignore errors

The following works well and captures all 2nd-column values for S_nn. The goal is to add the numbers in the 2nd column. awk -F "," '/s_/ {cons = cons + $2} END {print cons}' G.csv How can I change this to add only when nn is between N1 and N2, e.g. s_23 and s_24? Also, is it possible to count 1 if a line has junk instead of a number in the 2nd column? S_22, 1 S_23, 0 S_24, 1 S_25, 1 S_26, ? Sample input: sum s_24 to s_26 Sample output: 1+1+1=3 (the last one is for the error)
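A sketch of both requests with the sample rows inlined: split out nn, gate on a range (assumed here to be 24..26, matching the sample), and count 1 for a non-numeric 2nd column:

```shell
sum=$(printf 'S_22, 1\nS_23, 0\nS_24, 1\nS_25, 1\nS_26, ?\n' |
awk -F ', *' -v n1=24 -v n2=26 '
    /^[Ss]_/ { split($1, a, "_")
               if (a[2] + 0 >= n1 && a[2] + 0 <= n2)
                   cons += ($2 ~ /^[0-9]+$/) ? $2 : 1 }   # junk counts as 1
    END { print cons }')
echo "$sum"
```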

Bash Howto do floating point compasiosn in an if-statement within a GNU parallel block?

I want to run a batch process in parallel. For this I pipe a list to parallel. When I have an if-statement that compares two floating point numbers (taken from here), the code doesn't run anymore. How can this be solved? LIMIT=25 ps | parallel -j2 ' echo "Do stuff for {} to determine NUM" NUM=33.3333 # set to demonstrate if (( $(echo "$NUM > $LIMIT" | bc -l) )); then echo "react..." fi echo "Do stuff..." ' Prints: Do stuff for \ \ PID\ TTY\ \ \ \

What is the best way to create and edit a file from a bash script?

I have a bash script which creates a brand new file. I want to populate that file with some commands. What is the best way to do that? test.sh has sudo touch /tmp/myfile.txt I want myfile.txt to have the following lines of text: Hello World! Today is a good day! .... How can I do that from test.sh?
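A quoted heredoc is the usual way to populate such a file; for a root-owned path, pipe it through `sudo tee` instead of `>` (demo in a scratch directory):

```shell
cd "$(mktemp -d)"
cat > myfile.txt <<'EOF'
Hello World!
Today is a good day!
EOF
# for a protected path: cat <<'EOF' | sudo tee /tmp/myfile.txt >/dev/null
cat myfile.txt
```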

Bash if statement using "sed" to set variable not working

I'm trying to get an if statement to read the top line of a text file (tmp.txt), which has a 0 on the last line. The "then" commands basically go into a directory and run a series of commands for DNA sequence analysis before coming back up, removing the top line of tmp.txt, and moving on to the next directory listed in tmp.txt. Once it gets to the end of all the listed directories, the final line will just be a "0" or perhaps "file-end". The issue is, it's just not working and I can't figure out why.

Bash Script that will print HTTP headers for multiple servers

I've created the following bash script: #!/bin/bash for ip in $(cat targets.txt); do "curl -I -k https://"${ip}; "curl -I http://"${ip} done However I am not receiving the expected output, which is the HTTP header responses from the IP addresses listed in targets.txt. I'm not sure how curl can attempt both HTTP and HTTPS (80/443) within one command, so I've set up two separate curl commands.

Bash Copy only updated files, delete removed ones and compress updated files

https://www.mehr-schulferien.de hosts some 100,000 webpages. For performance reasons they are not delivered from the actual web application but from a static mirror. This mirror gets generated on every 1st of the month. It's done by running wget -m, followed by compressing all the files with gzip and brotli. Because this is a very old and slow server, that process takes nearly two days. I'd like to optimize the process. Less than 10% of the pages actually get new content every month. Some are del

Bash Cannot install the "ipywidgets" Jupyter Lab Extension on AWS sagemaker

To install Jupyter Lab Extension on AWS sagemaker, You need to follow https://github.com/aws-samples/amazon-sagemaker-notebook-instance-lifecycle-config-samples/tree/master/scripts. And then create the lifecycle configuration accordingly. I did it and this is my on-start.sh file. #!/bin/bash set -e # OVERVIEW # This script installs a jupyterlab extension package in SageMaker Notebook Instance sudo -u ec2-user -i <<'EOF' # PARAMETERS EXTENSION_NAME=@jupyter-widgets/jupyterlab-manager s

Bash The pwd command takes much longer after reading a big file (e.g. 40M)

The following test shell code was tested on CentOS 7 with the bash shell. The code contains three phases: phase 1 calls the pwd command; phase 2 reads a big file (cats the file); phase 3 does the same thing as phase 1. Phase 3's time cost is much bigger than phase 1's (e.g. 21s vs 7s), but on the macOS platform the time cost of phase 1 and phase 3 is equal. #!/bin/bash #phase 1 timeStart1=$(date +%s) for ((ip=1;ip<=10000;ip++)); do nc_result=$(pwd) done timeEnd1=$(date +%s) timeDelta=$((timeEnd

Bash prepending to the $PATH

In order to avoid ad-hoc setting of my PATH by the usual technique of blindly appending, I started hacking up some code to prepend items to my path (the asdf path, for example). pathprepend() { for ARG in "$@" do export PATH=${${PATH}/:$"ARG"://} export PATH=${${PATH}/:$"ARG"//} export PATH=${${PATH}/$"ARG"://} export PATH=$ARG:${PATH} done } It's invoked like this: pathprepend /usr/local/bin and /usr/local/bin gets prepended to PATH. The script is also supposed to cleanly remove
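The nested `${${PATH}...}` is not valid bash: one expansion works on one variable name. A working sketch of the same idea, removing any existing occurrence before prepending:

```shell
pathprepend() {
    local arg
    for arg in "$@"; do
        PATH=":$PATH:"                  # pad so every entry is :-delimited
        PATH=${PATH//":$arg:"/:}        # drop existing occurrences
        PATH=${PATH#:}; PATH=${PATH%:}  # strip the padding again
        PATH=$arg:$PATH                 # prepend
    done
    export PATH
}
PATH="/usr/bin:/usr/local/bin:/bin"     # demo value
pathprepend /usr/local/bin
echo "$PATH"
```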

Bash Append to protected file without creating a new line

I'm trying to make a bash script that, among other things, appends the string " fastboot noswap ro" to the end of /boot/cmdline.txt. At first I was trying sudo echo " fastboot noswap ro" >> /boot/cmdline.txt but was getting permission denied. I learned that for protected files, the best way to do it is echo ' fastboot noswap ro' | sudo tee -a /boot/cmdline.txt. That works, but it makes the text start on a new line. I can't find anything in the tee man page about appending without inserting a
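`tee -a` appends after the file's final newline, which is why the text lands on a new line; editing the existing line in place avoids that (GNU `sed -i`; add `sudo` for the real protected file). Demo on a stand-in file:

```shell
cd "$(mktemp -d)"
printf 'console=serial0 root=/dev/sda\n' > cmdline.txt   # stand-in for /boot/cmdline.txt
sed -i 's/$/ fastboot noswap ro/' cmdline.txt            # extend the line itself
cat cmdline.txt
```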

Bash escaping characters when passing JSON to aws secretsmanager

I have tried to write a script that updates AWS secrets. Yes, the update-secret command already does this, but it overwrites existing secrets instead of merging them with the new content. For example, suppose my-environment/my-application/secrets has the following content: { "db_1_pwd": "secret"} If I run my script like this: >> update_secret my-environment/my-application/secrets '{"db_2_pwd": "secreter"}' I would expect the new content to be: { "db_1_pwd": "secret", "db_2_pw

Parse a datetime from a string in bash

I have a string variable like the following: ./file_timestamp_2020_02_11_09_00_19 I would like to extract the datetime from this string and set it as a variable in the following format: 2020-02-11T09:00:19 I have tried the following without success: filename=$"./file_timestamp_2020_02_11_09_00_19" output=$($filename | grep -Eo '[[:digit:]]{4}_[[:digit:]]{2}_[[:digit:]]{2}_[[:digit:]]{2}_[[:digit:]]{2}_[[:digit:]]{2}') datetime=$($output +%Y_%m_%d_%H_%M_%S) I am getting the following error
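Two fixes: echo the variable into the pipeline (`$filename` alone tries to run the file as a command), and reshape the extracted stamp with sed capture groups rather than calling it as a command:

```shell
filename="./file_timestamp_2020_02_11_09_00_19"
stamp=$(echo "$filename" | grep -oE '[0-9]{4}(_[0-9]{2}){5}')
datetime=$(echo "$stamp" |
    sed -E 's/^(....)_(..)_(..)_(..)_(..)_(..)$/\1-\2-\3T\4:\5:\6/')
echo "$datetime"
```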

Bash How to find NGINX+ endpoints from a given URL

I am automating a script for collecting some info: given a specific URL, it returns all the servers in the upstream pool for that URL in a running NGINX+ instance. My goal is to have something that I can query on the web, or a script that I can run on an hourly basis to create an HTML page that I can query later with my script. I was planning to extract the info from the dashboard upstreams, but unfortunately the zones aren't descriptive enough and couldn't show all the matching URLs in the regex de

How to programmatically create output redirection in bash

I'm writing a test script that has a lot of repetition in it, and I'm looking for ways to reduce the code duplication. I thought it would be nice to programmatically create the redirection part of the commands. Here is a minimal example (in the real script I would generate different output file names in x()): #!/bin/bash set -x x() { echo '> out.txt 2> err.txt' } ./someProgram $(x) My hope was that stdout would end up in out.txt and stderr in err.txt. But bash quotes the
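Redirections are parsed before `$(x)` is expanded, so the expansion can only ever become arguments; `eval` forces a second round of parsing in which `>` is a redirect again (shown with `echo hello` standing in for ./someProgram):

```shell
cd "$(mktemp -d)"
x() { echo '> out.txt 2> err.txt'; }
eval "echo hello $(x)"     # second parse: > and 2> act as redirections
cat out.txt
```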

Bash How to stop vsftpd server

I downloaded the vsftpd file (the latest vsftpd release is v3.0.3) and manually installed it. I then started the server using the command ./vsftpd_v1 vsftpd_v1.conf and terminated it using Ctrl-z. When I tried to start it again using the same command it showed the following error: 500 OOPS: could not bind listening IPv4 socket I searched for the error and found that it means the server is already running, so another instance could not bind. So I am trying to stop the server but am not able to.

Bash Check if there is only one of two types of files in a directory

I have a script that checks if there is only one file in a directory. However, I can't figure out how to check if there is only one executable (no file extension) or script (.sh) in that directory. Here's what I currently have: loc=(/Applications/*) APPROOTDIR="${loc[RANDOM % ${#loc[@]}]}/" APPDIR="${APPROOTDIR}Contents/MacOS/" echo "APPROOTDIR is ${APPROOTDIR}" echo "APPDIR is ${APPDIR}" FIAD=$(ls ${APPDIR}) if [ `ls -1 ${APPDIR}* 2>/dev/null | wc -l ` == 1 ]; then echo "One executable

Rewrite the bash script in an efficient way to run based on the server

I am working on a script, and a snippet of its logic gets the processes running on a server. I have a function stat_check that performs some steps. I am wondering whether there is a more efficient way of doing this that decreases the number of lines. Goal - I want to pass a unique variable to stat_check based on the server. I can have the list of servers in a separate file and/or in a variable, as in Servers below. Servers=(flipunix1 flipunix2 flipunix3 flipunix7) for i in ${Servers[*]} do if [ "$SERVER_NAME
