  1. #1
    Member fortm's Avatar
    Join Date
    Jul 2011
    Posts
    30
    Member #
    28679
    I wanted to know if there is any way a website can be safeguarded from general trespassing as a whole.
    If there is sensitive data on the website, what should be the way forward?

    From a little googling, I have so far found out that most of the popular websites have a "robots.txt" file in their home directory which carries instructions on access and permissions.
    But at the same time, I also think robots.txt can be fooled, as it only instructs without enforcing anything..
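    For reference, a robots.txt is just a plain-text file of hints for crawlers; a minimal one (the directory names here are only placeholders) looks something like this:

        # robots.txt - purely advisory; well-behaved crawlers honour it, nothing enforces it
        User-agent: *
        Disallow: /private/
        Disallow: /admin/

    Anything listed in it is still downloadable by anyone who requests the URL directly, so it is a courtesy notice rather than a security measure.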

    Is HTTPS the only alternative? I would have to pay more for certificates then..

    I want to prevent access to the code base in a few directories, even though their functionality will very much be visible..
    My hosting platform is Linux without database support..

    Thanks!

  2. #2
    Senior Member Webzarus's Avatar
    Join Date
    May 2011
    Location
    South Carolina Coast
    Posts
    3,322
    Member #
    27709
    Liked
    770 times
    Robots.txt is only requested by legitimate search engines.

    If your host allows setting permissions at the folder level... that will use the server's own security to control who gets access to the files inside that folder...

    Some level of access control can be done with a scripting language, where you set up a login and password to access a section... but that is really only usable on pages you can include the authentication script on..

    If you have files where you want to control who accesses them... you should go with folder-level security from your hosting provider...
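    On a typical Linux host that usually means Apache, so "folder-level security" generally ends up being HTTP Basic Auth through an .htaccess file. A minimal sketch, assuming your host runs Apache and allows .htaccess overrides (the paths and names are placeholders):

        # .htaccess placed inside the folder you want to protect
        AuthType Basic
        AuthName "Restricted area"
        # absolute server path to the password file, ideally kept outside the web root
        AuthUserFile /home/youraccount/.htpasswd
        Require valid-user

    With that in place, the browser prompts for a username and password before serving anything from the folder.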

  3. #3
    Unpaid WDF Intern TheGAME1264's Avatar
    Join Date
    Dec 2002
    Location
    Not from USA
    Posts
    14,483
    Member #
    425
    Liked
    2783 times
    What Webzarus said. robots.txt is ignored completely by rogue bots (some of which have been documented in very-not-safe-for-work language by this dude).

    If you want it protected, folder-level permissions are the best way to go. If your host won't set them up and you're willing to switch to a Windows server, a web.config file in conjunction with ASP.net can be used to password-protect folders and files as well.

    http://weblogs.asp.net/gurusarkar/ar...eb-config.aspx <-- here's how you'd do it.

    The advantage to this route is that you can easily control, via database or whatever other means you see fit, who accesses what.
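    For anyone curious what that looks like, the usual pattern (a rough sketch only, not copied from the linked article; the folder name and login page are placeholders) is a <location> block in the site's root web.config that denies anonymous users for a specific folder:

        <!-- web.config in the site root -->
        <configuration>
          <system.web>
            <!-- forms authentication: unauthenticated users get sent to the login page -->
            <authentication mode="Forms">
              <forms loginUrl="login.aspx" />
            </authentication>
          </system.web>
          <location path="private">
            <system.web>
              <authorization>
                <deny users="?" />  <!-- "?" means anonymous (unauthenticated) users -->
              </authorization>
            </system.web>
          </location>
        </configuration>

    ASP.NET then redirects anonymous requests for anything under /private to the login page, where you can validate the user however you like (database or otherwise).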

    By the way... yes, I saw that the platform used is Linux, but I wanted to mention this as an alternative, and in case another forum user looking for security info either doesn't care about the platform or is looking for something Windows-specific (you never know what people are looking for these days).

  4. #4
    Member fortm's Avatar
    Join Date
    Jul 2011
    Posts
    30
    Member #
    28679
    Since I can't switch to Windows right now, I tried folder-level permissions..

        If you want it protected, folder-level permissions are the best way to go.

    Luckily, the FTP client of my web hosting provider has options to set permissions on folders and files.

    So I changed the permission to 700 for a folder, but then the website opened in a broken way..
    Then I changed it to 711; now the website opens, but the files are not secured, as they still get copied by ..

    Can you suggest what the permission level should be?
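    For anyone following along, those octal digits are the owner / group / other permission bits, so the two settings mean roughly this (the folder name is a placeholder, and this assumes the web server runs as a different user from the folder's owner, which is common on shared hosting):

        chmod 700 myfolder   # owner rwx, group ---, other ---  -> the web server can't read it, so pages break
        chmod 711 myfolder   # owner rwx, group --x, other --x  -> others can enter it and fetch files they know by name, but can't list it

    Either way, any file the web server is able to read and serve is still downloadable over HTTP, so the permission bits alone don't keep visitors out.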

  5. #5
    Senior Member Webzarus's Avatar
    Join Date
    May 2011
    Location
    South Carolina Coast
    Posts
    3,322
    Member #
    27709
    Liked
    770 times
    Those folder permissions only control file access on the server itself... they will not control who can access the files over the web.

    You should have some type of control panel associated with your hosting package... there should be something in there that will let you assign a username and password for access to a specific folder.

    If you are using a free or really cheap hosting package... this might not be an option.
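    (On Apache-based hosts that control-panel feature typically just writes an .htaccess rule like the one sketched earlier plus a password file. If you have shell access you can also create the password file yourself, assuming the htpasswd utility is available; the path and username are placeholders:)

        # -c creates the file; it prompts for the password and stores a hashed entry
        htpasswd -c /home/youraccount/.htpasswd someuser

    Point the AuthUserFile directive at that file and only the users listed in it can open the folder.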

