John Davidson

server - Command for downloading all website project files

Message:


I have access to the server of our company's website, which I reach using PuTTY.
I want to download all the files (index.php, any jpg files for favicons, etc.) used to deploy the web project. When I asked the website's programmer to share the full source (HTML, CSS, PHP, jQuery plugins), he said I could get the entire code from the server.
When I log into the server via PuTTY (private key + SSH) and type "ls", I see "index.php" and "mysql" entries, whose full contents I cannot download.
What are some useful resources listing the relevant commands?
Which command should I use to download the project folder containing all of its files, both code and other assets?
P.S. I do not know if this is relevant, but the website was deployed on DigitalOcean.



My Answer:
Use the following command to download all website project files with `scp` (secure copy over SSH):

```bash
scp -r username@server_ip:/path/to/website/directory /local/path/to/save/files
```

Replace `username` with your server username, `server_ip` with the IP address of the server, `/path/to/website/directory` with the directory where the website files are located on the server, and `/local/path/to/save/files` with the local directory where you want to save the downloaded files.

This command will recursively copy all files and directories from the specified website directory on the server to your local machine.





© 2024 Hayatsk.info - Personal Blogs Platform. All Rights Reserved.