Unable to drop the index


Issue: There is a table with an index on several columns. The table is schema-bound and makes up part of an indexed view. The user gets the error below:

Drop failed for Index 'UNIQUE_RECORD'.

Workaround: ALTER INDEX is the best option for this case. It modifies an existing table or view index (relational or XML) by disabling, rebuilding, or reorganizing the index, or by setting options on the index. Here we can disable the index: ALTER INDEX ... DISABLE marks the index as disabled and unavailable for use by the SQL Server 2005 Database Engine. Any index can be disabled. The index definition of a disabled index remains in the system catalog with no underlying index data. Disabling a clustered index prevents user access to the underlying table data. To enable an index again, use ALTER INDEX REBUILD or CREATE INDEX WITH DROP_EXISTING. For more information, see Disabling Indexes.
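For example (a sketch only; the index name comes from the error message, while the table name is a placeholder since the original post does not include it):

-- Disable the index instead of dropping it; the definition stays in the
-- system catalog, but the underlying index data is deallocated.
ALTER INDEX UNIQUE_RECORD ON dbo.MyTable DISABLE;

-- Bring it back later by rebuilding it.
ALTER INDEX UNIQUE_RECORD ON dbo.MyTable REBUILD;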

Reference link: http://technet.microsoft.com/en-us/library/ms188388.aspx

Rollback of SSIS


Issue: I have an SSIS package that inserts a lot of comments that we receive. Because our pdn server is offsite, my concern when loading the comments to the pdn server is that the internet connection might drop, in which case I need to roll back the transaction. Is there a way to do this, or is there a better way?

Workaround:

We can put the steps in the package inside a transaction so that the entire transaction rolls back if any problem occurs. Here is a video on how to do it:

http://www.jumpstarttv.com/Media.aspx?vid=39

To watch the video you just need to register on the site; after confirmation you can watch it.

Note about Transaction:

Transactions in SSIS allow you to protect your data and envelop any data change that occurs in the package in that protection. Transactions can be applied to the entire package, a container, or an individual task. They can be nested just like in T-SQL. What is important to note about transactions is that only data-related tasks are protected. If your package were to archive a file, for example, that file would not be un-archived upon a package failure (unless you explicitly created some kind of compensating action to perform such duties).

Let us take the previous package you created and wrap the package in a transaction. First, let's disable the checkpoint from the previous example by changing the CheckpointUsage package property to Never and the SaveCheckpoints property to False. Next, delete all the records from the RestartabilityExample table.

With the example now reset back to its original state, you're ready to enable transactions on the package. In the Properties window for the package, change the TransactionOption property at the bottom of the window to Required. By changing this from Supported to Required, you have created a transaction that envelops the package. Any task or container that has the same property set to Supported will join the transaction; by default, each task has its TransactionOption set to Supported.

For you to enable this type of transaction, though, you need the Microsoft Distributed Transaction Coordinator (MSDTC) started on each server that you want to participate in the transaction. So, if you have a package that changes data on two servers and runs on a third server, you would need MSDTC running on all three servers. Transactions can then protect data in nearly any database, such as SQL Server, Oracle, or UDB, since you have externalized the transaction handling to another service. The caveat, though, is that the database must be running on a Windows machine, since MSDTC is a Microsoft service.

Before executing a package that has transactions enabled, you must ensure that MSDTC is started. You can start the MSDTC service in the Services applet, or you can type the following command from a command prompt: NET START MSDTC. If you did not have the service started, you would receive the following error when executing the package:

Error: The SSIS Runtime has failed to start the distributed transaction due to error 0x8004D01B "The Transaction Manager is not available." The DTC transaction failed to start. This could occur because the MSDTC Service is not running.

After starting the service, let's break the package yet again. Select the last task, which is named Insert Record 2, and change its ForceExecutionResult property to Failure. Execute the package again and you should see the results shown in Figure 7-11. This time, if you were to view the records in the AdventureWorks database, you should see no new rows. This is because the purging of the table occurred, then the first record was inserted, and when the final task failed, both it and the first two operations (within the same transaction) were rolled back.

Send email using Database Mail without specifying a profile


Issue: A user is experimenting with sending email via sp_send_dbmail. If he sets @profile_name to NULL, he gets the following error:
No global profile is configured. Specify a profile name in the @profile_name parameter.
So, he would like to send email without specifying a profile while at the same time passing in @from_address.
Workaround:
As per the problem posted, here are two thoughts.
First thought:
When @profile is not specified, sp_send_dbmail uses a default profile. If the user sending the e-mail message has a default private profile, Database Mail uses that profile. If the user has no default private profile, sp_send_dbmail uses the default public profile. If there is no default private profile for the user and no default public profile, sp_send_dbmail returns an error.
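For example, one way to avoid specifying a profile is to mark an existing Database Mail profile as the default public profile. The sketch below assumes a profile named MyMailProfile already exists and that the server version supports the @from_address parameter; the profile name and the addresses are placeholders:

-- Make an existing profile the default public profile (principal 'public'),
-- so sp_send_dbmail can resolve a profile when @profile_name is omitted.
EXEC msdb.dbo.sysmail_add_principalprofile_sp
     @profile_name   = 'MyMailProfile',
     @principal_name = 'public',
     @is_default     = 1;

-- Send without @profile_name; @from_address overrides the profile's from address.
EXEC msdb.dbo.sp_send_dbmail
     @recipients   = 'someone@example.com',
     @from_address = 'alerts@example.com',
     @subject      = 'Test message',
     @body         = 'Sent without specifying @profile_name.';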

sp_send_dbmail does not support e-mail messages with no content. To send an e-mail message, you must specify at least one of @body, @query, @file_attachments, or @subject. Otherwise, sp_send_dbmail returns an error.
Database Mail uses the Microsoft Windows security context of the current user to control access to files. Therefore, users who are authenticated with SQL Server Authentication cannot attach files using @file_attachments. Windows does not allow SQL Server to provide credentials from a remote computer to another remote computer. Therefore, Database Mail may not be able to attach files from a network share in cases where the command is run from a computer other than the computer that SQL Server runs on.
If both @query and @file_attachments are specified and the file cannot be found, the query is still executed but the e-mail is not sent.
When a query is specified, the result set is formatted as inline text. Binary data in the result is sent in hexadecimal format.
The parameters @recipients, @copy_recipients, and @blind_copy_recipients are semicolon-delimited lists of e-mail addresses. At least one of these parameters must be provided, or sp_send_dbmail returns an error.
Reference Link: http://technet.microsoft.com/en-us/library/ms190307.aspx

Second thought:
How to configure SQL Server Database Mail to send email using your Windows Live Mail Account or your GMail Account–http://blogs.msdn.com/b/suhde/archive/2009/07/12/how-to-configure-sql-server-database-mail-to-send-email-using-your-windows-live-mail-account-or-your-gmail-account.aspx

Cannot access the SQL Server database using Windows Authentication (on a domain) once BitLocker is ON


Issue: The client is trying to use BitLocker to secure a SQL Server 2012 Standard installation, but whenever they turn BitLocker on, they cannot access the database using Windows Authentication (on a domain).
Workaround: You need to use the SQL Server service account to enable BitLocker and initialize the TPM.

Installation and initialization

BitLocker is installed automatically as part of the operating system installation. However, BitLocker is not enabled until it is turned on by using the BitLocker setup wizard, which can be accessed from either the Control Panel or by right-clicking the drive in Windows Explorer.

At any time after installation and initial operating system setup, the system administrator can use the BitLocker setup wizard to initialize BitLocker. There are two steps in the initialization process:

1. On computers that have a TPM, initialize the TPM by using the TPM Initialization Wizard, the BitLocker Drive Encryption item in Control Panel, or by running a script designed to initialize it.

2. Set up BitLocker. Access the BitLocker setup wizard from the Control Panel, which guides you through setup and presents advanced authentication options.

When a SQL service account initializes BitLocker, the administrator should also create a recovery password or a recovery key. Without a recovery key or recovery password, all data on the encrypted drive may be inaccessible and unrecoverable if there is a problem with the BitLocker-protected drive.

Reference link: http://technet.microsoft.com/en-us/library/cc732774.aspx

Could not delete the subscription at Subscriber


Issue: My server “SVR1” runs a SQL Server with a transactional publication. On the second server “SVR2” I had a subscription.
Now SVR2 does not exist anymore and I want to delete the replication on SVR1. (In SQL Server Management Studio, the subscription is still visible.)

But when I try to delete the SVR2 subscription on SVR1, I get error message 20032: "Could not delete the subscription at Subscriber 'SVR2' in database 'ICSDB'." How can I delete this subscription (and afterwards also the publication)?

I'm using SQL Server 2008 R2 on Windows Server 2008 R2.

Workaround:

Right-click on the publication, go to Properties, then to Subscriptions, and click Delete. This is the EM (Enterprise Manager) way. For the QA (Query Analyzer) way, generate the delete statement from EM; a sketch of what that statement looks like follows.
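The generated statement is typically a call to sp_dropsubscription, run at the Publisher in the publication database. This is only a sketch for the scenario above: the publication name is a placeholder, while the subscriber and database names come from the error message.

-- Run at the Publisher, in the publication database.
EXEC sp_dropsubscription
     @publication    = N'MyPublication',
     @article        = N'all',
     @subscriber     = N'SVR2',
     @destination_db = N'ICSDB';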

For future reference: I was having the same problems described above, but I also had database snapshots on the database that I was trying to remove replication from, and that was not mentioned anywhere in the error messages.

Reference Link: http://social.msdn.microsoft.com/Forums/sqlserver/en-US/b64bcde9-4769-45df-a8c7-19606c9c33ca/unable-to-drop-subscription?forum=sqlreplication

Once I removed my database snapshots, the following script cleaned things up well.

The example came from here: http://msdn.microsoft.com/en-us/library/ms188411.aspx

DECLARE @distributionDB AS sysname;
DECLARE @publisher AS sysname;
DECLARE @publicationDB AS sysname;
SET @distributionDB = N'distribution';
SET @publisher = N'S-ESN-SQL1\SQL1';
SET @publicationDB = N'AdventureWorks2008R2';

-- Disable the publication database.
USE [AdventureWorks2008R2]
EXEC sp_removedbreplication @publicationDB;

-- Remove the registration of the local Publisher at the Distributor.
USE master
EXEC sp_dropdistpublisher @publisher;

-- Delete the distribution database.
EXEC sp_dropdistributiondb @distributionDB;

-- Remove the local server as a Distributor.
EXEC sp_dropdistributor;
GO

The easiest way to recover your replication


Issue: A user has database replication from one server to another over a rather thin link. The servers are out of sync, so he would prefer to re-snapshot. Unfortunately the link is thin and transferring a snapshot would incur huge delays. He would prefer not to perform a snapshot but instead create a backup that can be hand-carried. He would like to do this ASAP on the existing replication model.

Workaround:

Once you have replication configured, you should be able to re-create it easily in case of a disaster or if you want to deploy it to a client's location. In Enterprise Manager, the easiest way to generate a SQL script for replication is to right-click on the publication in the Replication group and select Generate SQL Script. You are asked whether you'd like to create a script for the removal of replication or for the creation of replication. The script does the following (see the sketch after this list):
• Sets up the database for replication
• Creates the publication
• Assigns the appropriate permissions to the publication
• Adds articles to the publication
• Creates the subscription
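Roughly, such a generated script maps to system stored procedure calls like the ones below. This is a heavily simplified sketch: the database, publication, article, and subscriber names are placeholders, and a real generated script contains many more options (snapshot agent setup, security settings, and so on).

-- Set up the database for replication.
EXEC sp_replicationdboption
     @dbname = N'PubDB', @optname = N'publish', @value = N'true';

-- Create the publication.
EXEC sp_addpublication
     @publication = N'MyPublication', @status = N'active';

-- Add an article (table) to the publication.
EXEC sp_addarticle
     @publication   = N'MyPublication',
     @article       = N'Customers',
     @source_object = N'Customers';

-- Create the subscription.
EXEC sp_addsubscription
     @publication       = N'MyPublication',
     @subscriber        = N'SubscriberServer',
     @destination_db    = N'SubDB',
     @subscription_type = N'Push';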

If a disaster occurs on your publisher or subscriber servers, the easiest way to recover is to use the following steps:

1. Unsubscribe from the publisher.
2. Restore the most recent backup or backups.
3. Subscribe to the publisher.
4. Resynchronize the system.

Problem loading RDL file in SSRS 2005


Issue: Problem loading an RDL file in SSRS 2005. When the user opens Visual Studio 2005 and double-clicks on a report to open it, he gets a dialog box displaying some tmp file, and on the main screen he sees a message displaying:

“Navigation to the webpage was canceled”

Workaround:

Based on my knowledge, your problem was caused by IE settings or network settings.

Then I found a related explanation on this page: http://support.microsoft.com/kb/967941

“After you install security update 961260 (MS09-002) on a computer that is running Windows Vista or Windows Server 2008, you may experience a problem when you visit some Web pages that are in different Internet Explorer security zones. When the problem occurs, you receive a message that resembles the following:

Navigation to the webpage was canceled
This problem occurs if the navigation causes Protected Mode settings to change from On to Off. You can determine which Internet Explorer security zone a Web page is in by looking at the status bar at the bottom of the Internet Explorer 7 window."

So please tell us your IE version; we may be able to fix this with one of the solutions listed on that page.

For example:

"Undo workaround method 3: Disable Protected Mode for the "Trusted Sites" zone

1. Click Start, type Internet Options in the Start Search box, and then click Internet Options in the Programs list.
2. Click the Security tab, and then click Trusted Sites in the Select a zone to view or change security settings area.
3. Click to clear the Enable Protected Mode (requires restarting Internet Explorer) check box, and then click OK.
4. Restart Internet Explorer."

Reference Link: http://social.msdn.microsoft.com/Forums/vstudio/en-US/84eb77b9-f7f9-4fbf-89e1-69c7372dccd0/visual-studio-2010-professional-installer-navigation-to-the-webpage-was-canceled

Best Methods to Encrypt Functions and Stored Procedures: New blog doc: SQL Server Administration



Using Signatures on Procedures to Grant Permissions

A new—and, in my experience, little known—feature in SQL Server 2005 is the use of a signing certificate to sign stored procedures, assemblies, views, and functions. Administrators can assign permissions to the signing certificate itself and can then be sure that the permissions granted cannot be inadvertently modified by changing the stored procedure, assembly, and so on. Signed executable code cannot be changed without breaking the signature, which consequently invalidates any permissions granted to the signature—that is, unless the changed procedure, assembly, view, or function is re-signed with that signature.

The syntax for adding a signature is simple and includes the ability to add an existing signature (presumably created on another server) with a certificate containing only a public key and to specify a password if the certificate is password protected:

ADD SIGNATURE TO <module_name> BY CERTIFICATE <certificate_name>

    [ { WITH PASSWORD = 'password' | WITH SIGNATURE = binary_signature } ]
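As a minimal sketch of how this is used in practice (the procedure, table, certificate, user, and password below are illustrative placeholders, not names from the original article):

-- Create a certificate in the database that holds the procedure.
CREATE CERTIFICATE SigningCert
    ENCRYPTION BY PASSWORD = 'Str0ng#Passw0rd1'
    WITH SUBJECT = 'Signs dbo.usp_GetOrders';

-- Map the certificate to a database user and grant that user the permission
-- the procedure needs; callers of the procedure never receive this permission.
CREATE USER SigningCertUser FROM CERTIFICATE SigningCert;
GRANT SELECT ON dbo.Orders TO SigningCertUser;

-- Sign the procedure. If the procedure is later altered, the signature is
-- dropped and the extra permission disappears until the module is re-signed.
ADD SIGNATURE TO dbo.usp_GetOrders
    BY CERTIFICATE SigningCert
    WITH PASSWORD = 'Str0ng#Passw0rd1';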

The need for signing code may seem rather exotic. In most production environments, administrators do not need to worry about someone other than administrators changing procedures. However, there are important scenarios where this could be very useful:

• A signature enables a low-privileged account to execute a specific, pre-approved task that the account would not otherwise have permissions to perform.

• ISVs may find signing certificates useful for ensuring that the stored procedures, assemblies, and so on, that are released with their product are not changed.

• Signatures solve a difficult problem with cross-database authentication.

The ability to deploy signed stored procedures, assemblies, functions, and views can allow an ISV to effectively prevent code tampering by a customer’s database administrator. This can help prevent local changes that would otherwise become support issues. Signed procedures are easy to include in any deployment scenario: the signatures can be backed up and restored, attached, or scripted as a blob with the ADD SIGNATURE Transact-SQL statement. Technically, a system administrator may be able to temporarily work around this by implementing his or her own certificate of the same name and signing all of the same procedures, but this would not be trivial. It would also be detectable by the ISV.

The real power of signatures lies in cases where data for a particular query is spread among multiple databases on the server instance and there are different security models (or database owners) among the databases.

One way to address this is to ensure that the server logins are mapped as users in both databases and that the users have valid permissions in both databases. Every user granted permission to execute a query in the first database must also be granted appropriate permissions in the second database. Needless to say, this can quickly become unsupportable, especially when the databases are owned by different groups.
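As a rough illustration of the signature-based alternative (the database, object, and user names, the file path, and the password are assumptions for this sketch, and the exact permissions required can vary by scenario), the certificate's public key can be copied into the second database and mapped to a user there:

-- In the database that owns the procedure: create the certificate and sign the procedure.
USE SalesDB;
CREATE CERTIFICATE CrossDbSigningCert
    ENCRYPTION BY PASSWORD = 'Str0ng#Passw0rd1'
    WITH SUBJECT = 'Cross-database signing';
ADD SIGNATURE TO dbo.usp_GetCrossDbReport
    BY CERTIFICATE CrossDbSigningCert
    WITH PASSWORD = 'Str0ng#Passw0rd1';

-- Export only the public key (the private key stays in SalesDB).
BACKUP CERTIFICATE CrossDbSigningCert TO FILE = 'C:\Temp\CrossDbSigningCert.cer';

-- In the second database: recreate the certificate from its public key,
-- map it to a user, and grant that user what the query needs there.
USE ReportingDB;
CREATE CERTIFICATE CrossDbSigningCert FROM FILE = 'C:\Temp\CrossDbSigningCert.cer';
CREATE USER CrossDbCertUser FOR CERTIFICATE CrossDbSigningCert;
GRANT SELECT ON dbo.ReportSource TO CrossDbCertUser;

Because only the public key is copied, no one in the second database can use the certificate to sign additional code.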

Reference article: http://technet.microsoft.com/en-us/library/cc837966(v=sql.100).aspx
http://www.codeproject.com/Articles/38583/How-Do-I-Protect-My-Stored-Procedure-Code

Performance boost from optimize for ad hoc workloads



I faced a serious issue where a sales calculation was taking more than 7 hours, whereas according to IBM it should complete within 3 hours. The client uses the Varicent sales application (IBM software) to produce the sales reports. We have 64 GB of RAM and an 8-core processor, but SQL Server still came under memory pressure. I started investigating the memory pressure and observed a plan cache issue: by the nature of the application, it issues a lot of ad hoc queries. I resolved the issue by enabling optimize for ad hoc workloads, and the run time came down to 2 hours 30 minutes. I have described the optimize for ad hoc workloads option below so we can see how it helps improve performance.

In SQL Server 2008 and later, we can use optimize for ad hoc workloads, which is a server-level setting, i.e. it will affect every single database on the server (as opposed to forced parameterization, which is database-specific).

With this setting enabled, SQL Server does not cache the plan of an ad hoc query the first time it is encountered. Instead, a plan stub is cached that just indicates that the query has been seen before. Only if the query is seen a second time is the plan cached. This won't reduce the number of compiles for ad hoc queries, but it will make it less likely that the plan cache will grow as much, since the initial stub takes up very little memory. As such, it reduces the chances that other plans which could be reusable will be discarded due to memory pressure.

To enable the optimize for ad hoc workloads setting, use sp_configure, as shown below:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'optimize for ad hoc workloads', 1;
RECONFIGURE;
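After enabling the setting, a quick way to observe the effect is to look at what the plan cache contains. The query below is a rough sketch against sys.dm_exec_cached_plans; once the setting is active, single-use ad hoc queries appear as 'Compiled Plan Stub' entries, which are much smaller than full compiled plans.

-- Summarize the plan cache by cache object type and object type.
SELECT  cacheobjtype,
        objtype,
        COUNT(*)                                  AS plan_count,
        SUM(CAST(size_in_bytes AS bigint)) / 1024 AS total_kb
FROM    sys.dm_exec_cached_plans
GROUP BY cacheobjtype, objtype
ORDER BY total_kb DESC;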