Exam 312-50v11 topic 1 question 177 discussion

Actual exam question from ECCouncil's 312-50v11
Question #: 177
Topic #: 1
[All 312-50v11 Questions]

You start performing a penetration test against a specific website and decide to begin by grabbing all the links from the main page.
What is the best Linux pipe to achieve this milestone?

  • A. wget https://site.com | grep "<a href=\"http" | grep "site.com"
  • B. curl -s https://site.com | grep "<a href=\"http" | grep "site.com" | cut -d "\"" -f 2
  • C. dirb https://site.com | grep "site"
  • D. wget https://site.com | cut -d "http"
Suggested Answer: A

Comments

ronxz
Highly Voted 1 year, 10 months ago
Selected Answer: B
I tried wget, but it simply downloaded the webpage; its output wasn't piped to grep. Then I tried curl with example.com:

curl -s https://example.com | grep "<a href=\"http" | grep "iana.org" | cut -d "\"" -f 2

Output: https://www.iana.org/domains/example

Explanation:
  • curl -s = quiet/silent, no progress meter or error messages
  • grep "<a href=\"http" = grep lines containing hyperlinks to URLs; the quotation mark is escaped with a backslash
  • grep "iana.org" = grep lines containing the iana.org domain
  • cut -d "\"" -f 2 = output only the 2nd field of each grepped line; fields are delimited by quotation marks, with the quotation mark escaped by a backslash here too
upvoted 14 times
...
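ronxz's pipeline can also be checked offline by feeding a small hand-written HTML snippet (invented here purely for illustration) through the same grep/cut chain, without fetching any real site:

```shell
# Stand-in for a page that would normally come from `curl -s https://site.com`;
# the HTML below is made up for this demo.
html='<html><body>
<a href="https://site.com/about">About</a>
<a href="https://other.org/x">Other</a>
<a href="https://site.com/blog">Blog</a>
</body></html>'

# The first grep keeps anchor lines whose URL starts with http,
# the second grep filters for the target domain, and
# cut -d '"' -f 2 isolates the quoted URL (the 2nd quote-delimited field).
printf '%s\n' "$html" | grep '<a href="http' | grep 'site.com' | cut -d '"' -f 2
# → https://site.com/about
#   https://site.com/blog
```

Single quotes around the grep and cut arguments avoid the backslash-escaping that the exam option needs inside double quotes; the logic is identical.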
victorfs
Most Recent 11 months, 3 weeks ago
Selected Answer: B
The correct option is B
upvoted 1 times
...
victorfs
11 months, 3 weeks ago
The correct option is B
upvoted 1 times
...
crimson_18
1 year, 1 month ago
Selected Answer: B
should be B
upvoted 1 times
...
flinux
1 year, 7 months ago
Selected Answer: B
the answer is B
upvoted 2 times
...
bsto
1 year, 8 months ago
Selected Answer: B
It's B.
upvoted 1 times
...
juan201061
1 year, 9 months ago
Selected Answer: B
It's B.
upvoted 2 times
...
SeaH0rse66
1 year, 11 months ago
Selected Answer: A
wget | grep "< a href=\*http" | grep "site.com"
upvoted 1 times
...
cazzobsb
2 years ago
Selected Answer: B
correct
upvoted 1 times
...
Gilo
2 years, 1 month ago
Selected Answer: B
Defo B
upvoted 1 times
...
Urltenm
2 years, 1 month ago
I prefer B. You will see the whole list on your screen; just try it.
upvoted 1 times
...
gokhansah1n
2 years, 2 months ago
Selected Answer: B
wget saves index.html to a file; curl prints the requested web resource to the screen, and the commands concatenated with pipes yield the links inside the web page. The answer is B. You should try it in a shell on a Linux system to see for yourself.
upvoted 3 times
...
Oliverotuns
2 years, 2 months ago
Probably B
upvoted 1 times
...
SH_
2 years, 2 months ago
Selected Answer: B
Try this out and see that the answer is B.
upvoted 1 times
...
spydog
2 years, 2 months ago
Selected Answer: B
By default wget saves the page content to a file, so piping it to grep will not work. wget can write page content to standard output, but that requires an additional flag. Even if we accept that wget writes to standard output, the grep commands would return only URLs containing the specific domain, not all URLs. curl writes the page to standard output, which can be piped to grep to keep only the lines with links (href tags) and then to cut to strip the HTML, leaving just the URLs.
upvoted 4 times
...
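spydog's point about standard output can be demonstrated without touching the network by letting curl read a local file via its file:// scheme (the path and page contents below are invented for the demo); wget would need `-qO-` on an http URL to behave the same way:

```shell
# Write a tiny local page so the demo needs no network access
# (the path and contents are invented for illustration).
printf '<a href="https://site.com/a">link</a>\n' > /tmp/demo_page.html

# curl streams the response body to stdout by default, so it pipes cleanly.
# wget, by contrast, saves the page to disk unless told otherwise
# (e.g. `wget -qO- https://site.com`), which is why option B works as written.
curl -s file:///tmp/demo_page.html | grep '<a href="http' | cut -d '"' -f 2
# → https://site.com/a
```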
andreiiar
2 years, 3 months ago
Answer is B. `curl` outputs to stdout, which makes it suitable to pipe to grep. `wget` just saves to a file (unless you use the flag `-O -`). Tested on Ubuntu 20.04:

```
$ curl -s http://example.com/ | grep '<a href' | cut -d"\"" -f2
https://www.iana.org/domains/example
```
upvoted 2 times
...
ProveCert
2 years, 4 months ago
Selected Answer: A
wget | grep "< a href=\*http" | grep "site.com"
upvoted 3 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other
