Linux - Newbie
This Linux forum is for members that are new to Linux. Just starting out and have a question? If it is not in the man pages or the how-to's, this is the place!
I have been writing some awk scripts. For each type of file I have a different implementation of what needs to be done. Would it be possible to have everything in a single file, and if so, can I see an example of how such a thing can be done?
It depends on the specific awk scripts which you have and how you decide to merge them, but in principle it can be done.
I'd write an example, but LQ is blocking it with some JavaScript:
Code:
Checking if the site connection is secure
Enable JavaScript and cookies to continue
www.linuxquestions.org needs to review the security of your connection before proceeding.
The example would have been above and started with #!/usr/bin/awk -f as the shebang.
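Since the example itself was blocked, here is a minimal sketch of the general idea, not the poster's actual script: a single awk program that dispatches on each input file's extension via FILENAME, so that per-file-type logic can live in one file. The file names sample.e1 and sample.e2 and their formats are made up for illustration.

Code:
```shell
#!/bin/sh
# Hypothetical sketch: one awk program handling two file types by
# inspecting FILENAME instead of keeping one script per type.
printf 'a,1\n' > sample.e1
printf 'b 2\n' > sample.e2

out=$(awk '
FNR == 1 {                        # first line of each new input file
    n = split(FILENAME, p, ".")
    ext = p[n]                    # extension of the current file
    FS = (ext == "e1") ? "," : " "
    $0 = $0                       # re-split this line with the new FS
}
ext == "e1" { print "e1:", $2 }
ext == "e2" { print "e2:", $2 }
' sample.e1 sample.e2)

echo "$out"
rm -f sample.e1 sample.e2
```

Re-assigning $0 to itself forces awk to re-split the already-read line with the newly chosen FS, which is what makes the per-extension field separator take effect on the first line too.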
Client-side processing is inappropriate on so many levels.
More efficient: keep your different awk scripts and test/branch in the calling shell script.
Code:
#!/bin/sh
for fn in filea.e1 fileb.e2 filec.e3
do
    case ${fn##*.} in
    ( e1 ) echo "run awk.e1 on $fn"
        ;;
    ( e2 ) echo "run awk.e2 on $fn"
        ;;
    ( * )  echo "do other action on $fn"
        ;;
    esac
done
@MadeInGermany The more efficient version is what I currently have (separate files for different purposes).
I could not improve on the aforementioned strategy. It is a shame that awk and sed suffer such limitations.
In shell scripts one can conveniently use functions for different tasks. Although awk lets one define
functions, they are not identical to bash functions.
Can one put BEGIN clauses inside functions, and the like? The -f option reads the awk program source from a source file; it cannot run a specific function inside an awk file. In a bash script, for instance, one can call a specific function, which can itself take positional arguments, and one can also call a different bash function doing completely different things in the same bash file. I cannot see how to do the same thing with awk files.
One still ends up with multiple files, each of which is quite short. Right? Then it is much easier to just call awk commands from bash without the need to put the awk code in separate awk files.
@MadeInGermany Sure, one can write awk functions and call them within an awk script. But can one access specific awk functions from bash? Can one include different BEGIN and END clauses in different awk functions? And can one write patterns and actions in awk functions?
Actually I don't really understand it. awk and bash are two different languages. You can start any number of awk scripts from bash, and you can start any number of scripts from awk too. But you cannot invoke a function written in awk directly from bash.
In such cases you can put your awk scripts into different files (or variables) and run (execute) the one you need. But I think you should use only one language to implement a given functionality; mixing two (or more) will definitely make things harder.
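The "scripts in variables" idea might look like the following sketch: each shell variable holds one small awk program, and the shell chooses which to execute. The variable names, the nums.txt file, and the task value are all made up for the example.

Code:
```shell
#!/bin/sh
# Sketch: small awk programs kept in shell variables; the shell picks one.
prog_count='END { print NR }'
prog_sum='{ s += $1 } END { print s }'

printf '3\n4\n5\n' > nums.txt

task=sum                          # imagine this arrived as "$1"
case $task in
    count ) out=$(awk "$prog_count" nums.txt) ;;
    sum )   out=$(awk "$prog_sum"   nums.txt) ;;
esac
echo "$out"
rm -f nums.txt
```

This keeps everything in one shell file while still letting each awk program stay a self-contained unit, much like the separate-files approach.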
bash is good enough in a lot of cases, but cannot do calculations very well.
awk and shell are different languages. They don't share anything (e.g. variables and functions are separate).
The shell has good interfaces to the processes that it runs.
BEGIN and END can run functions.
A function cannot contain BEGIN or END; why should it?
Do you mean
Code:
pattern {action}
?
You can always have an if clause in an action block:
Code:
{if (pattern) {further action}}
regardless if the action block is in the main (input loop) section or in a BEGIN or END or function.
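Putting those points together, here is a small sketch: a user-defined function called from both BEGIN and END, with the pattern moved into an if clause inside the main action block. The function name and the sample input are invented for the example.

Code:
```shell
#!/bin/sh
# Sketch: BEGIN and END calling an awk function; pattern as an if clause.
out=$(printf 'foo\nbar\nfoo\n' | awk '
function report(label, n) { print label ":", n+0 }
BEGIN { report("start", 0) }        # a function called from BEGIN
{ if (/foo/) count++ }              # the pattern tested by an if clause
END { report("foo lines", count) }  # and the same function from END
')
echo "$out"
```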
I have been scrutinising the possibility of writing various awk functionalities without requiring a separate awk file for each task. Bash scripts seem to be much more useful in that respect.
Thank you for clarifying that BEGIN and END clauses are not permitted inside awk functions. The reason for my question was the possibility of having multiple BEGIN clauses in the same file, with an option to run a particular one, so as to avoid needing a separate awk file. My awk files for the different tasks are very small, a few tens of lines each, and I had been looking at the task of putting them together in a single file.
From the discussion, I would be better served using small separate files, as I am currently doing. Would using positional-argument options possibly handle different BEGIN clauses in an awk file?
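One common way to get that effect, sketched below, is awk's -v option rather than positional arguments: a variable set on the command line selects between branches of a single BEGIN block, each branch standing in for what would otherwise be a separate BEGIN clause. The file name multi.awk and the csv/tsv modes are made up for the example.

Code:
```shell
#!/bin/sh
# Sketch: one awk file whose BEGIN behaviour is chosen with -v mode=...
cat > multi.awk <<'EOF'
BEGIN {
    if (mode == "csv")      FS = ","
    else if (mode == "tsv") FS = "\t"
    # each branch plays the role of a separate BEGIN clause
}
{ print $2 }
EOF

a=$(printf 'x,1\n'  | awk -v mode=csv -f multi.awk)
b=$(printf 'y\t2\n' | awk -v mode=tsv -f multi.awk)
echo "$a $b"
rm -f multi.awk
```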
The drawback of the if clause is that the pattern is always tested. Suppose I do not want the pattern to be tested in certain situations. Could I avoid testing for the pattern? Could I just branch so it does nothing?
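One way to skip the test entirely, sketched here, is to prefix the regex with a flag variable: awk's && operator short-circuits, so when the flag disables the branch the pattern is never evaluated at all. The skip variable name is invented for the example.

Code:
```shell
#!/bin/sh
# Sketch: !skip && /regex/ short-circuits, so the regex is not tested
# when skip is set.
a=$(printf 'foo\nbar\n' | awk -v skip=1 '!skip && /foo/ { print "hit" }')
b=$(printf 'foo\nbar\n' | awk -v skip=0 '!skip && /foo/ { print "hit" }')
echo "a=[$a] b=[$b]"
```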