Guide to Securing a File Server

How well is your organization's file server hardened against unauthorized access? Every organization running a file server needs a way of protecting sensitive information from unauthorized access, especially from outside.

The fact that your server may be in a physically secure place does not mean the security configurations within the operating system should be ignored and the system left at the mercy of anyone with access to the building.

As the most visible and central network device holding critical information, the file server should be secured using the approaches this guide proposes. System administrators should take a hacker's point of view when reviewing security policies; it helps them find a better approach to server security settings.

The Guide

Securing a file server ensures that the organization enjoys the full benefits of a server in its optimum working condition. Here are some of the steps needed to enforce server security settings.

1. Configure the Operator Group
Instead of using the default Administrators group, create a separate group of user accounts authorized to access the server at different access levels. The default Administrator account should be used only as a last resort.

The Operators group is defined by the tasks its members are assigned with respect to modifying security settings. To set it up, open the Organizational Unit and remove unwanted groups, but retain the default Domain Administrators. Then create the Operators group and grant Full Control to both Domain Administrators and the Operators group.
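
As a minimal sketch of that setup, assuming the ActiveDirectory PowerShell module and the dsacls tool, with a hypothetical OU path, domain, and group name:

    # Create the Operators group in the target OU (names are examples)
    New-ADGroup -Name "Operators" -GroupScope Global -Path "OU=FileServers,DC=example,DC=com"

    # Grant Full Control ("GA" = generic all) on the OU to both groups
    dsacls "OU=FileServers,DC=example,DC=com" /G "EXAMPLE\Domain Admins:GA"
    dsacls "OU=FileServers,DC=example,DC=com" /G "EXAMPLE\Operators:GA"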

2. Create A Security Template
By the time you sit down to secure the file server, you should already have security settings that can be used on the network or in a standalone environment. Security templates can be created and then imported into Group Policy to be applied across the entire network.

The security templates need to be flexible, so that when more than one is applied to the same server there is never a conflict.

Setting Account Policies
Account policies are settings that govern Passwords, Account Lockout, and Kerberos Policy, and they should all be applied at the domain level. A standalone file server is accessed with local accounts, so the security settings of those accounts need to be configured.

Other settings include (a template sketch follows this list):

  • Increasing the minimum password length – longer passwords mean an attacker needs more time to crack local account passwords.
  • Decreasing the maximum password age – a shorter maximum age means more frequent password changes, thereby increasing the integrity of local user accounts.
  • Account lockout duration – this setting determines how long a locked-out user has to wait before re-entering a password. A value of '0' means the account stays locked until an administrator unlocks it, while a value of '30' means attackers have to wait 30 minutes between rounds of attempts.
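
As a sketch, the three settings above map directly onto the [System Access] section of a security template file; the values below are examples only, not recommendations:

    ; fileserver.inf - minimal security template sketch (example values)
    [Unicode]
    Unicode=yes
    [Version]
    signature="$CHICAGO$"
    Revision=1
    [System Access]
    MinimumPasswordLength = 12
    MaximumPasswordAge = 42
    LockoutBadCount = 5
    LockoutDuration = 30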

Setting the Local Policies
Under local policies, system administrators need to look at the Audit policies, user rights, and security options.

  • Audit Policy
    Turning on the Audit Policy setting means that files, registry keys, folders, and printers can be tracked. This gives administrators the freedom to choose which objects to log and the level of monitoring to use. In a high-security environment, knowing who is responsible for which activity is paramount, so auditing is required when securing the file server (a command-line sketch follows this list).
  • User Rights
    When setting user rights on the local machine, powerful rights such as "Act as part of the operating system" should not be granted.
  • Security Options
    Some default settings under this option can be used to tighten the security level of a file server. Some of these defaults may not work with all server operating systems, so testing before implementation is recommended.
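
As a minimal sketch of turning on auditing from the command line, the two categories shown here are common examples, not a complete policy:

    # Enable success and failure auditing for object access and logon events
    auditpol /set /category:"Object Access" /success:enable /failure:enable
    auditpol /set /category:"Logon/Logoff" /success:enable /failure:enable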

Settings on Services
Services that are not needed should be disabled, and a person responsible for starting, stopping, and disabling them should be identified. Two services deserve attention in specific scenarios: the Distributed File System (DFS) and the File Replication Service (NTFRS).

DFS presents local disks shared across the network under a single namespace, so disabling DFS means users must know the actual names of the shared resources and servers on the network. NTFRS controls the automatic maintenance of files across multiple servers and must remain enabled on Active Directory Domain Controllers.
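
A short PowerShell sketch for reviewing and disabling a service you do not need; the service names below are the classic DFS and FRS names, so verify them on your own system first:

    # Check the current state of the DFS and FRS services
    Get-Service -Name Dfs, NtFrs | Format-Table Name, Status, StartType

    # Disable DFS if it is not needed in your scenario
    Stop-Service -Name Dfs
    Set-Service -Name Dfs -StartupType Disabled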

3. Determine the Template Application Strategy
Security templates can be imported into Group Policy at the domain level so that they are shared across multiple workstations and refreshed as needed. On individual workstations, security policies can be applied using the secedit command, with batch files re-run to refresh the settings.
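
A hedged sketch of applying such a template to a single machine with secedit (the file names are the examples used above):

    # Apply the security template to the local machine
    secedit /configure /db C:\Windows\security\fileserver.sdb /cfg fileserver.inf /overwrite /log apply.log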

4. Set up Restricted Groups
Make use of the Local Group Policy to restrict group membership. The local Administrators group and any group that was created can be restricted, and new users can be added to the restricted groups. Restricting which administrators can change system settings prevents unauthorized access.
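
Before restricting a group, it helps to review who is currently in it; a one-line sketch using the built-in local-accounts cmdlets:

    # List the current members of the local Administrators group
    Get-LocalGroupMember -Group "Administrators"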

5. Write IPsec Policies
Setting IPsec rules that block some or all ports, with filters that allow communication only from specific computers, goes a long way toward securing the file server. When left open, file servers expose information through various protocols that would-be attackers find useful.
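
As a simplified sketch using Windows Firewall rules instead of a full IPsec policy (the subnet below is a placeholder), SMB access can be limited to a trusted address range:

    # With inbound traffic blocked by default, a single allow rule
    # limits SMB (TCP 445) to the trusted subnet
    netsh advfirewall set allprofiles firewallpolicy blockinbound,allowoutbound
    netsh advfirewall firewall add rule name="SMB from trusted subnet" dir=in action=allow protocol=TCP localport=445 remoteip=10.0.0.0/24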

6. Set the Correct System Time
The correct time is critical for several reasons. Some authentication protocols require client and server clocks to be synchronized, and synchronizing events between computers on a network will fail if their clocks differ. When reading logs, the timestamp is also important for auditing purposes.
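
On a domain member, a minimal sketch for syncing time from the domain hierarchy:

    # Take time from the domain hierarchy and force a resync
    w32tm /config /syncfromflags:domhier /update
    w32tm /resync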

7. Set Specific Account Restrictions
Restrictions can be implemented on individual accounts by limiting logon hours, restricting which workstations a particular user can use, preventing account delegation, and so on.
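
A sketch of such restrictions (the user and workstation names are hypothetical; the last line assumes the ActiveDirectory module):

    # Limit logon hours and the workstations this account can use
    net user alice /times:M-F,08:00-18:00
    net user alice /workstations:FS01

    # Mark the account as sensitive so it cannot be delegated
    Set-ADUser -Identity alice -AccountNotDelegated $true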

8. Setting the Local Server Security Settings
Some settings cannot be automated in a domain; the Guest user account and Guests group, for example, have unique identifiers that make automated security settings difficult. In such scenarios, setting local policies is necessary.
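
For instance, the built-in Guest account can be disabled locally with a single command:

    # Disable the built-in Guest account on this machine
    net user Guest /active:no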

9. Track Attack Indicators
Events with warnings such as 'Logon Failure', and a rising count of similar events, should be treated as unauthorized access attempts.
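
A minimal PowerShell sketch for pulling recent failed logons (event ID 4625 in the Security log; requires an elevated session):

    # List the 20 most recent failed logon events
    Get-WinEvent -FilterHashtable @{LogName='Security'; Id=4625} -MaxEvents 20 |
        Format-Table TimeCreated, Id, MachineName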

10. Use the NTFS File System
Setting access control lists (ACLs) and system access control lists (SACLs) is not possible on FAT volumes. File server security depends on the ability to apply security settings as file permissions, so data volumes should use NTFS.
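
A sketch in which the drive letter, path, and group are examples: a FAT volume can be converted in place, after which NTFS permissions can be applied:

    # Convert a FAT volume to NTFS without losing data
    convert D: /FS:NTFS

    # Grant the Sales group Modify rights, inherited by subfolders and files
    icacls "D:\Shares\Sales" /grant "EXAMPLE\Sales:(OI)(CI)M"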

11. Use of Administrative Template Settings
Some security settings are not available in security templates; they can only be set via the Administrative Templates within Group Policy. Disabling error reporting at this stage is recommended, because some error reports contain sensitive information that can be intercepted by attackers.
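
One way to switch off Windows Error Reporting from the command line is the documented registry value shown below (a sketch; the corresponding Group Policy setting under Administrative Templates achieves the same):

    # Turn off Windows Error Reporting machine-wide
    reg add "HKLM\SOFTWARE\Microsoft\Windows\Windows Error Reporting" /v Disabled /t REG_DWORD /d 1 /f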

12. Documenting and Maintaining the Security Settings
When server security settings need to be altered, the previous settings can be used as a reference point. Built-in tools such as the Security Configuration and Analysis snap-in can be used to verify that the security settings in place are compliant.
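
The same check can be scripted with secedit, reusing the example file names from step 2:

    # Compare the machine's current settings against the saved template
    secedit /analyze /db C:\Windows\security\fileserver.sdb /cfg fileserver.inf /log analyze.log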

Conclusion

The security settings above are meant to make the file server secure. However, when applying them, make sure they do not interfere with the server's normal operations. Since every network setup is unique, it is recommended that file server security be configured according to the needs and expectations of the organization.

Are your Windows folders secure, too?

Protect yourself and your clients against security leaks and get your free trial of the easiest and fastest NTFS Permission Reporter now!

Prevent Data Theft with File Audit

If you are operating a business or running an organization, you know how vital it is to keep your data safe. Cybercrime is one of the most widespread threats facing businesses. According to the Identity Theft Resource Center (ITRC), in 2017 alone some 174 million records were breached.

And that is only what was documented; if every theft were documented, the number would be even higher. The saddening fact is that while discovering a data breach normally takes a long time, at times even measured in years, data exfiltration takes only minutes to execute. Businesses therefore need to invest in modern means of securing their data.

Where the Data is Most Likely to be Found

Most organizations store their important data on servers. Cybercriminals are aware of this, which is why they target servers, where they can obtain files containing valuable data such as bank and credit card details, personal health information, personally identifiable information (PII), a corporation's trade secrets, intellectual property, other credentials, and much more.

Where a company keeps its data is determined by its business processes. For instance, data may sit in databases, office documents, or files staged for data transfer operations, all of which make good targets for data theft.

A company therefore needs to launch a data security audit to avoid losing important data, and with it money.

What Data Security Auditing is

Data security auditing is the process whereby company data is analyzed to establish how the data is used and who accesses it.

Once all this is established, a strategy is created to document the data. To determine data security risks, it is vital to first understand how delicate business information moves within the company and how it moves out of the company.

Why Your Company Needs Data Security Auditing

Every firm is at risk from lawsuits involving delicate data such as private information, emails, and web content. If a company holds sensitive information on its computers, it is held accountable for that information if it gets lost, stolen, or misused.

Reviewing your company's data security from time to time means performing an investigative audit of the business's data.

The process of securing a company's data should be watertight and leave no security loopholes. This is also one of the provisions in federal regulations, which further state that data destruction and/or recycling must be secure as well.

FACTA, GLBA, HIPAA, and Sarbanes-Oxley are some of the regulations that govern the data lifecycle. They require both low- and high-profile companies to store their data securely.

Why Data Auditing is Important

Research carried out by the Ponemon Institute and Symantec revealed that up to 39% of data breaches were the result of negligent insiders, while 37% of incidents were the work of hackers. The latter turned out to be among the costliest violations, estimated at up to $222 for every document lost or stolen.

These days, many small but growing businesses are turning to information security plans to make sure they are protected against data breaches. In the same vein, insurance products such as Cyber Liability policies are being adopted by more and more businesses. An adequate data security plan calls for companies to perform frequent data security audits of their own.

It's not enough to have a complete suite of defensive anti-data-breach programs; you also need to ensure that these programs and procedures are actually effective. Make sure the business technology infrastructure does not contain weak spots, and that the company's software and hardware are properly configured and running as designed.

Educate your employees on how to properly protect your clients' private data. All of this can be achieved if regular data security audits are performed.

Common Open Source Security Tools You Should Try

  1. NMAP
    This security tool helps you discover the devices attached to your network. NMAP is not only a host discovery tool but also works as a port scanner, checking services against its database of more than 2,200 well-known services. It helps you understand what is on your network and whether a device is using a vulnerable port, protocol, or service (a basic scan example follows this list).
  2. Suricata
    This open source IDS/IPS is highly scalable. It inspects multi-gigabit traffic and checks for matches on threats, policy violations, and malicious activity. Automatic protocol detection enables Suricata to scan for malware and command-and-control channels.
  3. TrueCrypt
    Initially a free tool for encrypting files, partitions, or whole storage devices, TrueCrypt ceased being maintained in 2014. However, its open source code gave rise to two new security tools, CipherShed and VeraCrypt. Both are based on the original TrueCrypt code but have since undergone code upgrades.
  4. Wireshark
    Most security professionals and network administrators prefer Wireshark as their tool of choice. Wireshark is a free packet analyzer, a perfect tool for network troubleshooting and analysis. Users can examine data from their live network, or from a capture file, in real time. The tool is also handy when diagnosing security devices and reviewing event logs.
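
As a starting point with NMAP, a basic host-and-service scan of a subnet (the address range is a placeholder) looks like this:

    # Discover hosts and identify service versions on a /24 network
    nmap -sV 192.168.1.0/24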

Do you have unclear NTFS Permissions assignments?
Do you have too many special permissions set on your fileservers?
Or blocked NTFS Permission Inheritance?

Protect yourself and your clients against security leaks and get your free trial of the easiest and fastest NTFS Permission Reporter now!

7 Best Practices in Managing NTFS Permissions

Whether you're in the planning phase or have already implemented NTFS permissions, following some best practices ensures smooth administration and aids in resolving access issues quickly.

Here are seven practices we find effective in managing NTFS permissions.

#1 Grant Full Control on the Share and Specific NTFS Permissions on Folders

It's a good practice to give "Everyone" Full Control at the share level and then define specific permissions at the NTFS level, just as Microsoft recommends.

We’ve established that this is the best way of combining Share Permissions and NTFS Permissions.

You can visit this post to read more about it.
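
A minimal sketch of that combination (the share name, path, and group are hypothetical; the SmbShare cmdlets and icacls are assumed):

    # Share level: Everyone gets Full Control (hypothetical share)
    New-SmbShare -Name "Sales" -Path "D:\Shares\Sales" -FullAccess "Everyone"

    # NTFS level: actual access is restricted, e.g. Modify for the Sales group
    icacls "D:\Shares\Sales" /grant "EXAMPLE\Sales:(OI)(CI)M"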

#2 Share Folders with Groups, Not Users

This makes administration easier. Imagine sharing the “Sales” folder with 10 sales people.

Sounds Easy?

Okay, how about sharing it with 100 sales people?

Of course, the task is doable, but it would be a lot simpler to put them all in one group (such as a Sales group) and then share the folder with that group.

The same logic can be used when applying NTFS permissions.

#3 Organize your Resources

To ease administration, it's important to keep application files and data files in their own individual folders. Furthermore, consolidating folders with the same security requirements will assist in managing their access rights.

For instance, if users require “Read” permissions for several application folders, store those folders within a single folder. This will allow you to grant the permission to that larger folder, instead of doing that for each application folder.

It’s also easier to manage the permissions of application or data folders when they are stored on their own, rather than when mixed with other file and data types.

Additionally, backups will be less complex, since you can choose which folders to back up without worrying whether other file types will be included.

#4 Use “Read & Execute” for Application folders

When you assign permissions for working with application folders, assign the “Read & Execute” permission to the Users group and Administrators group.

"Read & Execute" permits only viewing, accessing, and executing a file. This prevents application files from being accidentally deleted or damaged by users or viruses.
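
A sketch with icacls (the folder name is an example):

    # Give Users and Administrators Read & Execute on the application folder,
    # inherited by subfolders (CI) and files (OI)
    icacls "D:\Apps" /grant "Users:(OI)(CI)RX" /grant "Administrators:(OI)(CI)RX"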

#5 Assign minimum permissions only

Assign minimum permissions that allow users to perform the required tasks.

For example, if a user needs to read information in a folder, and should never delete or create files, assign only the “Read” permission.

Doing so prevents unauthorized access to critical data, making your environment more secure.

In a complex environment, however, over-privileging can happen, especially when users belong to multiple groups, causing them to have access they shouldn't have.

By using tools such as FolderSecurityViewer or the Effective Permissions tool, you can examine the permissions each user has and act on them accordingly.

#6 Use intuitive naming convention

Using intuitive share names allows users to easily recognize and locate resources. For example, for the Application folder, use "Apps" as the share name.

Although this is a basic practice that is often ignored, following an intuitive naming convention can save you from unnecessary calls or emails from employees asking which folder is the right one.

Also, use share names that can be used across all client operating systems.

#7 Document everything

And we mean everything, even the slightest changes. It’s always good to have something to go back to when you forget who has access to what.

This not only serves as your guide but also as something you can share with other admins in your group to ensure everyone is on the same page.

Also, since changes in the organization are inevitable, whatever method you use for documentation, ensure it can easily be modified and expanded.

Useful Resources

Do you want to learn about NTFS Permissions and Share Permissions, and how to use them?

Grab your free course here (no signup, with downloadable eBooks):

Prevent Unauthorized Access to Sensitive Windows Folders!

Get your free edition of the easiest and fastest NTFS Permission Reporter now!

Planning before implementing NTFS Permissions

If you're a Windows administrator, you've probably experienced the nightmares of managing folder permissions. They are common in large and even small environments where no proper planning is done before permissions are granted. Such negligence can lead to complications and expose the environment to security risks. Below are some examples:

  • Users or groups having access to folders not intended for them (e.g., Sales Group can view Management’s folders)
  • Applications fail to run because of lack of permission (e.g., Backup Software unable to perform tasks on specific folders)
  • Or folder permissions so convoluted that admins are better off redoing them from scratch.

Why Planning is a Crucial Step Before Implementing NTFS Permissions

All the examples above are due to incorrect planning (or the lack of it) before the implementation of NTFS permissions. One may point out that they can also be due to the incompetence of the person doing the task. That can happen too, but with proper planning, documentation, and layout, these problems can be avoided even if you let your junior admin do the task.

As part of the Planning phase, here are some of the things an Admin can do:

Design a Folder Structure

Before creating the actual folders, you must know which folders are to be created. Whether you prefer a digital or a physical board, list the shares that will be created for each department or group. Work with the knowledge you already have of your current environment. There will be changes along the way (e.g., a new department or new projects), but this is a good start.

Identify who has access

After listing the shares to be created, map out the users or groups that have access to specific folders. You may list the users or groups and draw a line connecting them to the appropriate shares. However you get it done, make sure to have fun doing it!

Plan the Permissions

This one is critical, so take your time going through the shares and groups and write down the appropriate permissions. If you use naming conventions such as R for Read-only or F for Full Control, be consistent to avoid confusion along the way.

Proper Documentation

Good planning always comes with good documentation. It's always good to have something to go back to when you forget. This not only serves as your guide but is also something you can pass down to your junior staff or even to your boss. With that said, documentation must be clear and concise. Also, changes in the organization are inevitable, so whatever method you use to document, make sure it can easily be modified and expanded.

Being an admin can be stressful, but proper planning, implementation, and clear documentation smooth administration and help you focus on other areas.

A more detailed guide on Planning and Managing NTFS Permissions can be found here (no signup, incl. free eBook):

Manage Data Access – Day 5: Tools, Quality Checks and Reporting

Manage Data Access – Day 4: Assignment of Active Directory Users to Security Groups

Manage Data Access – Day 3: Protection of Data with the Help of Active Directory Security Groups

Manage Data Access – Day 2: Definition of Business Processes and Responsibilities

Manage Data Access – Day 1: Designing Folder Structure and Policies for Permission Assignment

Top Ten Mistakes Made While Managing Data Access on Windows Fileservers

What are the top ten mistakes made while setting up and operating Windows fileservers? Here I'll talk about them.

Read more