Showing posts with label single.

Friday, March 30, 2012

How to INSERT text that contains single quotes?

When users enter text into a textbox, to be INSERTed into my table, SQL Server throws an error if their text contains a single quote.

For example, if they enter "It's great!" then it causes this error:
Error: Incorrect syntax near 's'. Unclosed quotation mark after the character string ''.

How can I allow text with single quotes to be inserted into the table?

Here's my code:

string strInsert = "INSERT INTO [Comments] ([GameID], [UserID], [Comment]) VALUES (@GameID, @UserID, @Comment)";

SqlConnection myConnection = new SqlConnection(<<myconnectionstuff>>);
SqlCommand myCommand = new SqlCommand(strInsert, myConnection);

myCommand.Parameters.Add( "@GameID", Request.QueryString["GameID"] );
myCommand.Parameters.Add( "@UserID", (string)Session["UserID"] );
myCommand.Parameters.Add( "@Comment", ThisUserCommentTextBox.Text );


try {
myCommand.Connection.Open();
myCommand.ExecuteNonQuery();
}

catch (SqlException ex) {
ErrorLabel.Text = "Error: " + ex.Message;
}

finally {
myCommand.Connection.Close();
}
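For what it's worth, the parameterized command above is the right approach: a placeholder (like @Comment) sends the text separately from the SQL, so an embedded single quote never reaches the parser. A minimal sketch of that idea in Python, with SQLite standing in for SQL Server purely for illustration:

```python
import sqlite3

# Illustrative only: the parameter placeholder carries the text
# out-of-band, so "It's great!" needs no escaping at all.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Comments (Comment TEXT)")
con.execute("INSERT INTO Comments (Comment) VALUES (?)", ("It's great!",))
row = con.execute("SELECT Comment FROM Comments").fetchone()
print(row[0])  # It's great!
```

If the error still appears with parameters in place, the quote is being concatenated into a SQL string somewhere else in the page.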


You write C#, so these are called string literals in C#, but in ANSI SQL they are delimiters (per ANSI SQL-92, with which SQL Server is compliant). Run a search for "Quoted Identifier" in SQL Server BOL (Books Online), and try the link below for SQL Server delimiters. Hope this helps.

http://msdn2.microsoft.com/en-gb/library/ms176027.aspx

|||That code looks fine. Are you sure that is where the problem is?|||

(How can I allow text with single quotes to be inserted into the table?)

The poster had not posted the code that produced the error, but I replied with the above before the error was shown.

|||

Motley:

That code looks fine. Are you sure that is where the problem is?

Well it works fine unless I include a single quote in the text box somewhere.

Caddre, I've researched some more based on your reply, but I don't have easy access to the SQL Server settings - I'm on a shared hosting account. Is there some other workaround, maybe using a regular expression to replace a single quote with something else, at least temporarily while I'm processing the text? I tried entering /' instead, to see if an escape character would work, but SQL Server still won't take it.

|||


Yes, there is: SET QUOTED_IDENTIFIER ON/OFF will help you deal with it. Try the link below for details.

http://msdn2.microsoft.com/en-US/library/ms174393.aspx

|||

Caddre:

Yes, there is: SET QUOTED_IDENTIFIER ON/OFF will help you deal with it. Try the link below for details.

http://msdn2.microsoft.com/en-US/library/ms174393.aspx

I'm wondering how to send the "SET QUOTED_IDENTIFIER ON" command within my ASP.NET page.


I'm using a command called ExecuteNonQuery(); -- so does that mean I can just append it to the beginning of my INSERT string, like this?

string strInsert = "SET QUOTED_IDENTIFIER ON GO INSERT INTO [Comments] ([GameID], [UserID], [Comment]) VALUES ..."


Does it last only during this query or will it stay on somehow until I turn it off?


Sorry, I'm a newbie with SQL Server and ASP.NET.


|||You have to use it in your stored proc, and ExecuteNonQuery can run a stored proc. Hope this helps|||

Yes, I understand. It should work fine for any value you place within the textbox control. That includes quotes. I do this quite often with no problems.

As for whether you can put the SET statement before your query, the answer is yes, but do not separate the commands with GO. You may, however, place a semicolon between them if you want, like:

SET {stuff}; SELECT {stuff}

|||

OK, I am trying to make sense of this, and it's just not making any sense to me. (I have limited experience with SQL and SQL Server and VBScript; I paid a programmer to write a script and after delivering a partly-working script, he promised to fix it and then disappeared. I think he couldn't solve this problem himself.)

I am getting the error message "Unclosed quotation mark after the character string" when I execute this SQL statement from VBScript, but no error when I execute the same SQL statement from within SQL Management Studio. Any suggestions would be extremely welcome.

INSERT INTO Raw_Datafeeds (ProductID,Name,MerchantID,Merchant,Link,Thumbnail,BigImage,Price,RetailPrice,Category,SubCategory,Description,Custom1,Custom2,Custom3,Custom4,Custom5,LastUpdated,Status,Last_Loaded)
VALUES
('454848129','Healthy Ears Ear Cleanser - Buy 2 Get 1 Free','10052',
'Pet Nutrition Products','http://www.shareasale.com/m-pr.cfm?merchantID=10052&userID=YOURUSERID&productID=454848129',
'http://www.petnutritionproducts.com/images/products/100/cat_ears.jpg',
'http://www.petnutritionproducts.com/images/products/400/cat_ears.jpg',15.98,0,'Home/Family','Pets',
'Healthy Ears Ear Cleanser is a veterinarian formulated combination of gentle ingredients that aid in the removal of dirt, wax and other unwanted matter from your cat''s ears.',
'Dog','Ear Care',' ',' ',' ','6/6/2006 1:35:01 AM','instock','10/25/2006 4:06:00 PM')

|||

The original problem in this thread could probably be solved by specifying the type of each parameter rather than letting the system guess.

As for mark's problem, it's obviously related to "cat''s" in your insert string, however since you haven't posted the code you are using in VBScript, it'd be difficult to say where the problem is. As a quick hack, you can replace all quotes in any parameter with '+CHAR(39)+' so that cat's would then become cat'+CHAR(39)+'s.
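The quick hack described above is just a string substitution; a short Python sketch (illustrative only) that produces the suggested form:

```python
def char39_hack(value: str) -> str:
    """Rewrite embedded single quotes as '+CHAR(39)+' so a naively
    concatenated T-SQL literal still parses (the quick hack above)."""
    return value.replace("'", "'+CHAR(39)+'")

print(char39_hack("cat's"))  # cat'+CHAR(39)+'s
```

Parameterized commands remain the cleaner fix; this only patches strings that are being concatenated directly into SQL.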

How to insert records from triggers during a batch update

I want to split the records from table1 and insert the split records into table2. The trigger works fine when I update a single record, but it does not work properly during a batch update: it splits only the last record from the query and inserts that into table2, instead of inserting all the records.

Pls help me........

|||I suspect you wrote your trigger to work with one record at a time. I say this because of your comment that the trigger works for a one-record update and for the LAST record in a batch. Take a look at your trigger and ask yourself "What happens if I have three records to deal with in either the inserted OR deleted temp tables?"

Post your trigger and maybe we can offer some suggestions.|||Hi

This is the trigger I have used to split the records for various conditions from table1 and insert the split records into another table, table2.

The problem I'm facing is that for a single update (when only one row is affected) the trigger works fine, but not for a batch update.

If you are unable to follow this trigger, pls give your own batch update example and we will try to implement that.

Pls help me.

CREATE trigger udt_Break_Split
ON Hours_tmp
/*
this trigger is used to do the break split in the table hours_tmp
and insert the records into the table selnmuamid

*/

AFTER UPDATE

AS
Declare @Id int
Declare @Res_id nvarchar(8)
Declare @DTime smalldatetime
Declare @StartTime char(5)
Declare @EndTime char(5)
Declare @Pauze char(5)
Declare @Hourworked nvarchar(5)
Declare @Status char(1)
--Declare @PPayroll bit
Declare @Project nvarchar(10)
Declare @Naam char(50)
Declare @Type char(5)
Declare @Hryear nvarchar(4)
Declare @Totmin char(5)
Declare @Endtimebr char(5)
Declare @Stime char(5)
Declare @Start char(5)
Declare @Processed char(1)
Declare @del_DTime smalldatetime
Declare @del_StartTime char(5)
Declare @del_EndTime char(5)
Declare @Del_id int
Declare @Totrows int

--update hours_tmp set hours_tmp.processed_payroll = 1 from hours_tmp, inserted where hours_tmp.id = inserted.id

Select @Id=Id,
@Res_id=Res_id,
@DTime=[Datetime],
@Start=Starttime,
@EndTime=Endtime,
@Pauze=Pauze,
@Hourworked=Hoursworked,
@Status=Status,
@Project=Project,
@Naam=Naam,
@Type=Type,
@Hryear=Hryear,
@Processed=Processed
From inserted -- i, Hours_tmp h where i.[Id] = h.[id]

Select @del_DTime=[DateTime],
@del_Starttime=Starttime,
@del_endTime=Endtime,
@del_id=id
From Deleted

Delete From Selnmuamid
Where Res_id=@Res_id
and Datetime=@del_DTime
and Starttime >=@del_starttime
and Endtime <=@del_endtime

-- This is To Get total hours (hourworked column in the table ) in minutes

Select @Totmin=Sum(Convert(Integer,(Left(@hourworked,Charindex('.',@hourworked)-1) * 60)) + Convert(Integer,(Right(@hourworked,Len(@hourworked) - Charindex('.',@hourworked))))),
@Endtimebr=Sum(Convert(Integer,(Left(@Endtime,Charindex(':',@Endtime)-1) * 60)) + Convert(Integer,(Right(@Endtime,Len(@Endtime) - Charindex(':',@Endtime))))),
@Stime=Sum(Convert(Integer,(Left(@Start,Charindex(':',@Start)-1) * 60)) + Convert(Integer,(Right(@Start,Len(@Start) - Charindex(':',@Start)))))

--Values Passed as Parameter to Stored procedure Break_Split

Exec Break_Split @Id,@Res_id,@DTime,@Start,@EndTime,@Pauze,
@Hourworked,@Status,@Project,@Naam,@Type,
@Hryear,@Totmin,@Endtimebr,@STime,@Processed

|||Basically I just wrapped your code in a while loop to step through each record in the inserted temp table. Then I moved the delete from Selnmuamid to the end.

This has NOT been tested!

Code:
------------------------------

CREATE trigger udt_Break_Split
ON Hours_tmp
/*
this trigger is used to do the break split in the table hours_tmp
and insert the records into the table selnmuamid

*/

AFTER UPDATE

AS

-- ----------------
-- One Declare is more efficient than multipule declares
-- ----------------
Declare @Id int
, @Res_id nvarchar(8)
, @DTime smalldatetime
, @StartTime char(5)
, @EndTime char(5)
, @Pauze char(5)
, @Hourworked nvarchar(5)
, @Status char(1)
-- , @PPayroll bit
, @Project nvarchar(10)
, @Naam char(50)
, @Type char(5)
, @Hryear nvarchar(4)
, @Totmin char(5)
, @Endtimebr char(5)
, @Stime char(5)
, @Start char(5)
, @Processed char(1)
, @del_DTime smalldatetime
, @del_StartTime char(5)
, @del_EndTime char(5)
, @Del_id int
, @Totrows int

--update hours_tmp set hours_tmp.processed_payroll =1 from hours_tmp , inserted where hours_tmp.id = inserted.id

select @Id = min(ID) from inserted
while (@Id is not null) begin
Select @Res_id = Res_id
, @DTime = [Datetime]
, @Start = Starttime
, @EndTime = Endtime
, @Pauze = Pauze
, @Hourworked = Hoursworked
, @Status = Status
, @Project = Project
, @Naam = Naam
, @Type = Type
, @Hryear = Hryear
, @Processed = Processed
-- ---------------------------------------------------
-- Hoursworked, Endtime and StartTime need to be converted to minutes
-- ---------------------------------------------------
, @Totmin = Convert(Integer,(Left(Hoursworked,Charindex('.',Hoursworked)-1) * 60)) + Convert(Integer,(Right(Hoursworked,Len(Hoursworked) - Charindex('.',Hoursworked))))
, @Endtimebr = Convert(Integer,(Left(Endtime, Charindex(':',Endtime)-1) * 60)) + Convert(Integer,(Right(Endtime, Len(Endtime) - Charindex(':',Endtime))))
, @Stime = Convert(Integer,(Left(Starttime, Charindex(':',Starttime)-1) * 60)) + Convert(Integer,(Right(Starttime, Len(Starttime) - Charindex(':',Starttime))))
From inserted i
where id = @id


-- This is To Get total hours (hourworked column in the table) in minutes
/*
Select @Totmin = Sum(Convert(Integer,(Left(@hourworked,Charindex('.',@hourworked)-1) * 60)) +
Convert(Integer,(Right(@hourworked,Len(@hourworked) - Charindex('.',@hourworked)))))
, @Endtimebr = Sum(Convert(Integer,(Left(@Endtime,Charindex(':',@Endtime)-1) * 60)) +
Convert(Integer,(Right(@Endtime,Len(@Endtime) - Charindex(':',@Endtime)))))
, @Stime = Sum(Convert(Integer,(Left(@Start,Charindex(':',@Start)-1) * 60)) +
Convert(Integer,(Right(@Start,Len(@Start) - Charindex(':',@Start)))))
*/
-- Values Passed as Parameter to Stored procedure Break_Split

Exec Break_Split @Id,@Res_id,@DTime,@Start,@EndTime,@Pauze,@Hourworked,@Status,@Project,@Naam,@Type,@Hryear,@Totmin,@Endtimebr,@STime,@Processed

select @Id = min(ID) from inserted where id > @id
end

-- -------------------
-- The deleted temp table can be joined to the Selnmuamid table
-- to delete the unwanted records. If you need the ID from the
-- inserted table then join the inserted table with the
-- deleted table and then join the Selnmuamid table.
-- This is a good example of how you can ditch the row-at-a-time
-- mentality and start using set theory for processing records.
-- -------------------
delete s
from Selnmuamid s
join deleted d on s.Res_id = d.Res_id
where s.[Datetime] = d.[DateTime]
and s.Starttime >= d.Starttime
and s.Endtime <= d.Endtime

/*
Select @del_DTime = [DateTime]
, @del_Starttime = Starttime
, @del_endTime = Endtime
, @del_id = [id]
From Deleted
where Id = @Id

Delete
From Selnmuamid
Where Res_id = @Res_id
and [Datetime] = @del_DTime
and Starttime >= @del_starttime
and Endtime <= @del_endtime
*/

------------------------------
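The WHILE loop above steps through the inserted table one key at a time without a cursor: take the smallest ID, process it, then take the smallest ID greater than the last one. A minimal Python sketch of the same iteration pattern (illustrative only):

```python
def process_each(ids, handle):
    """Visit every id in ascending order using only min() and a
    'greater than the last one' filter -- the same pattern as the
    WHILE loop in the trigger above."""
    current = min(ids, default=None)
    while current is not None:
        handle(current)
        remaining = [i for i in ids if i > current]
        current = min(remaining, default=None)

seen = []
process_each({3, 1, 2}, seen.append)
print(seen)  # [1, 2, 3]
```

As the reply notes, though, the delete at the end shows the better habit: operate on the whole set at once where you can, and loop only when you must.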

How to insert multiple values to a column at a time?

Hi all,

I was looking to insert multiple values into a single column at a time in SQL.

Any help pleas!

Thank you

Ephi:

What exactly do you mean by "multiple values"? Different data types? A vector? An array? What, exactly?


Dave

|||

Let's say I have a single column called X, and I want to insert multiple values into X, like (1,2,3), at a time using the insert statement.

Thank you in advance.

|||Hi,

In reality this is not acceptable according to data normalization rules.
That is, if you want such behavior, you should instead create an additional table and have a foreign key constraint mapping to it.
Some example:

table_x(
field_1 .....,
field_2 .....,
field_in_which_you_want_to_have_multiple_values.....
);

table_y(
value_identifier .....,
value nvarchar(1024) ....
);

So you first insert several values into table 'table_y' and then just add the
value_identifier to 'table_x.field_in_which_you_want_to_have_multiple_values'|||
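The two-table shape described above can be sketched end-to-end; here is a minimal illustration using an in-memory SQLite database (column types and sample values are assumptions for the demo):

```python
import sqlite3

# Sketch of the normalized shape: several values share one identifier
# in table_y, and table_x points at them through that identifier.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE table_y (
    value_identifier INTEGER,
    value TEXT
);
CREATE TABLE table_x (
    field_1 TEXT,
    value_identifier INTEGER REFERENCES table_y (value_identifier)
);
""")
con.executemany("INSERT INTO table_y VALUES (?, ?)",
                [(1, "a"), (1, "b"), (1, "c")])
con.execute("INSERT INTO table_x VALUES ('row1', 1)")
vals = [v for (v,) in con.execute(
    "SELECT value FROM table_y WHERE value_identifier = 1")]
print(vals)  # ['a', 'b', 'c']
```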

do you mean you want to create multiple rows, with one row for each value, using a single Insert statement?

This is not possible. Insert only creates one row in the table.

Unless, of course, you are inserting into one table using the values from another table, in which case you can use the insert...select syntax.

Normal insert syntax:

insert table_name
(col1, col2, col3)
values
(val1, val2, val3)

only inserts one row.

|||I would recommend you use Itzik Ben-Gan's Split function:

CREATE FUNCTION dbo.fn_SplitTSQL
(@arr NVARCHAR(MAX), @separator NVARCHAR(1) = N',') RETURNS TABLE
AS
RETURN
SELECT
n - LEN(REPLACE(LEFT(@arr, n), @separator, '')) + 1 AS pos,
SUBSTRING(@arr, n,
CHARINDEX(@separator, @arr + @separator, n) - n) AS element
FROM dbo.Nums
WHERE n <= LEN(@arr)
AND SUBSTRING(@separator + @arr, n, 1) = @separator;

Once you've got this, you could do something like:

insert into mytable (col1, col2, col3)
select 'Val1', 'Val2', element
from dbo.fn_SplitTSQL(N'1,2,3', N',')

This should handle it nicely for you. Oh yes, and you'll need a table called Nums with a field called 'n', which you have populated from 1 to some arbitrarily large number. 1000 might be big enough for most of your uses...

There's more on this at:
http://www.sql.co.il/books/insidetsql2005/source_code/TechEd%202006%20Israel%20-%20Advanced%20T-SQL%20Techniques.txt

Robsql
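For intuition about what fn_SplitTSQL returns — each element of the delimited list together with its 1-based position — here is the same behavior sketched in Python (illustrative, not a replacement for the T-SQL):

```python
def split_elements(arr: str, separator: str = ","):
    """Return (pos, element) pairs, mirroring the pos/element
    columns yielded by fn_SplitTSQL above."""
    return list(enumerate(arr.split(separator), start=1))

print(split_elements("1,2,3"))  # [(1, '1'), (2, '2'), (3, '3')]
```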

How to insert multiple values in a single column

I need to display output as shown below. In Col1, Col2 and Col3 I want to insert values from subqueries, but I do not want to use sub-reports... is there any alternative to subqueries?

In Col1, Col2 and Col3 there can be any number of values.

Company    Bank     Col1  Col2  Col3
AstroTech  ICICI    123   5     34
MindTree   HDFC     54    8

why don't you want to use a sub-report?

|||

It takes a very long time, because I have 17 columns in the report... it's not feasible to insert 15 sub-reports.

|||

Ah I understand, and agree. Not sure how else to implement this though...could you insert another table into your main table's cells?

|||

Not resolved yet... any ideas...

|||

One idea would be to build your data using a view. Capture the main data (Company, Bank, ID) and then capture each column's data (based on the ID in the first query). Select from the view.

I think your data would be more like:

ID Company Bank ManagerNames BranchOffices TopSalesNames

1 CompanyA BankOne Jerry, Ted, Lisa Omaha, Chicago Fred, Mary

2 CompanyB BankTwo Paul Lincoln, Springfield, Florence William, John

Not sure if this meets your needs. You may be able to format the multi-value rows once in the report. Hope this helps.

|||

thanks... I will try this

How to insert multiple rows using stored procedure

How to insert multiple rows using a single stored procedure, preferably as an atomic process?

You need to include a SAVEPOINT in your T-SQL code so a rollback or interruption will not take the transaction back to the beginning, like the sample below. Hope this helps.

SAVE TRANSACTION SavepointName
IF @@error = some Error
BEGIN
ROLLBACK TRANSACTION SavepointName
COMMIT TRANSACTION
END

Kind regards,
Gift Peddie|||You have a couple of options here:

1) create a delimited key/value pair and parse it in the proc
2) package the values as an XML chunk and use OPENXML to shred the doc and perform the insert

I prefer option 2.

And with regard to atomicity, you wrap 1 or 2 in a BEGIN TRAN with COMMIT or ROLLBACK in the proc.|||Since I have never used OPENXML, can you give me a link to a good tutorial on how to pass XML from .NET code to a SQL stored procedure?

Thanks|||Have a look at the following article:

Decomposing with OpenXML
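As a rough sketch of the client side of option 2, the caller can package all the rows as a single XML string and pass it as one parameter to the proc; element and attribute names below are made up for illustration:

```python
import xml.etree.ElementTree as ET

def rows_to_xml(rows):
    """Package a list of (col1, col2) tuples as one XML chunk,
    suitable for a proc that shreds it with OPENXML.
    Element and attribute names here are hypothetical."""
    root = ET.Element("rows")
    for a, b in rows:
        ET.SubElement(root, "row", {"col1": str(a), "col2": str(b)})
    return ET.tostring(root, encoding="unicode")

print(rows_to_xml([(1, "x"), (2, "y")]))
```

The proc then performs the whole insert from that one document, which is what makes wrapping it in a single transaction straightforward.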

Wednesday, March 28, 2012

How to insert data with a single quote?

Hi guys, could any of you kindly reply on this?

Thanks guys,

Make the single quote into doubled single quotes:
select 'Chris''s'
will select: Chris's
Hope this helps,
-chris|||Thanks a lot Chris...........|||

Hi emkcah,

Chris's suggestion is valid, but beware if the character data has two (or more) single quotes, as this will then fail (as in the string You're at Chris's). Also, the suggestion is erroneous because your source data is in essence now corrupt and no longer portable.

In this scenario, you should:

SET QUOTED_IDENTIFIER OFF

GO

SELECT "You're at Chris's"

Cheers

Rob

|||Thanks Robert|||

You can use a function to check user input for a "single quote" and add the second "single quote" if you're adding records based on user input into a form. The following function changes a last name such as "O'Day" to "O''Day".

Function call:

lastname = fixapostrophe(Server.HtmlEncode(lastname))

Function:

Private Function fixapostrophe(ByVal newText As String) As String
Dim apostrophe As String = "'"
Dim pos As Integer
Dim start As Integer = 1
Do While InStr(start, newText, apostrophe) > 0
pos = InStr(start, newText, apostrophe)
newText = newText.Insert(pos, apostrophe)
start = pos + 2
Loop
Return newText
End Function

It seems to work OK, but you should check with a real programmer before using it.
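For comparison, the VB function above simply doubles each apostrophe; the same effect in a one-line Python sketch:

```python
def fix_apostrophe(text: str) -> str:
    """Double every single quote, like the VB fixapostrophe above."""
    return text.replace("'", "''")

print(fix_apostrophe("O'Day"))  # O''Day
```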

|||There's a much easier solution, just use the replace function:
select replace(field1, '''', '`') from tablename
eg insert into [destination table] select replace(fieldname, '''', '`') from [sourcetable]
Look it up in Books Online.
look it up on books on line..|||

Hello,

Using Replace will corrupt your source data. If the string "you're" is passed to the app, it should remain "you're" and be stored as such.

Cheers

Rob



Friday, March 23, 2012

How to index a large varchar field

Here's the situation: we have a character-based field of up to 2048
characters in length that we want to store in a single column in a
SQL Server table. We need to be able to search on this field and guarantee
uniqueness, so it would be nice if this field had an index with a unique
constraint. The problem is that SQL Server can't index anything over
900 characters in length.
Any suggestions appreciated.
We have a couple of thoughts so far:
1 - compress this field in the data access layer before saving in SQL -
that should keep the length of the field under 900 chars and let us
apply a unique index in SQL. However, we would still have the
uncompressed, unindexed version of the field in another column
for searching.
2 - split the field into multiple columns, each under 900 characters,
and apply an index to each column. However, we'd lose the uniqueness
constraint and this would complicate searches.
3 - combination of 1 and 2.
by the way - we're using SQL Server 2005.
In an effort to maintain uniqueness you could investigate creating three
additional columns (varchar(683)), then populate the data from the field in
question evenly across the new columns. To ensure uniqueness you could
create a concatenated key with the three new columns. Just a thought....
Thomas
"steve.c.thompson@gmail.com" wrote:

> Here's the situation: we have a character based field of up to 2048
> characters in length that we want to store in a single column in a
> sqlserver table. need to be able to search on this field and guarantee
> uniqueness. it would be nice if this field has an index with a unique
> constraint. the problem is that sqlserver can't index anything over
> 900 characters in length.
> any suggestions appreciated.
> we have a couple thoughts so far:
> 1 - compress this field in the data access layer before saving in sql -
> that should keep the length of the field under 900 chars and let us
> apply a unique index in sql. however, we would still have the
> uncompressed, unindexed version of the field in another column
> searching.
> 2 - split the field into multiple columns each under 900 characters.
> apply an index to each column. however, we'd lose the uniqueness
> constraint and this would complicate searches.
> 3 - combination of 1 and 2.
> by the way - we're using SQL Server 2005.
>
|||I was thinking the same thing, Thomas. I created the three additional
columns and then tried to create the composite key, but SQL won't create
an index across the three fields since the sum of the field lengths is
greater than 900.
steve.
|||steve.c.thompson@gmail.com wrote:
> Here's the situation: we have a character based field of up to 2048
> characters in length that we want to store in a single column in a
> sqlserver table. need to be able to search on this field and
> guarantee uniqueness. it would be nice if this field has an index
> with a unique constraint. the problem is that sqlserver can't index
> anything over 900 characters in length.
> any suggestions appreciated.
> we have a couple thoughts so far:
> 1 - compress this field in the data access layer before saving in sql
> - that should keep the length of the field under 900 chars and let us
> apply a unique index in sql. however, we would still have the
> uncompressed, unindexed version of the field in another column
> searching.
Compression means you're dealing with binary data and you're not going
to get a lot of compression on a max of 2K worth of data.

> 2 - split the field into multiple columns each under 900 characters.
> apply an index to each column. however, we'd lose the uniqueness
> constraint and this would complicate searches.
Adds additional management.
I would recommend that you hash the data and generate a 32-bit hash
value. You can do this from the client using .Net, for example, or using
an extended stored procedure. SQL Server has the T-SQL BINARY_CHECKSUM()
function, but it has some limitations generating the same checksum for a
few different values. However, the T-SQL function would be quite easy to
implement using an INSERT/UPDATE trigger on the table.
select BINARY_CHECKSUM('ABC123'), BINARY_CHECKSUM('aBC123')
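One caveat with any hash-based approach: the hash only narrows the candidates, so uniqueness still has to be confirmed against the full 2048-character value, since distinct strings can share a hash. A small Python sketch of that collision-aware check (CRC-32 stands in for whatever hash you choose):

```python
import zlib

def insert_unique(store, value):
    """store maps hash -> list of full values; reject duplicates by
    comparing full strings, since different values can collide."""
    h = zlib.crc32(value.encode())
    bucket = store.setdefault(h, [])
    if value in bucket:
        return False          # true duplicate
    bucket.append(value)      # new value (possibly a hash collision)
    return True

store = {}
print(insert_unique(store, "abc"))  # True
print(insert_unique(store, "abc"))  # False
```

In SQL terms: index the hash column (non-unique), and have the insert path look up rows with the same hash and compare the full field before inserting.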
|||Steve,
I think you must get the data size down to 900 characters or less,
otherwise you will not be able to guarantee uniqueness.
As for the searching: if you have to search in the texts (for example
LIKE '%some text%'), then an index is only useful if the data column is
narrow in comparison to average row size. That is probably not the case
here, which means a table scan (or clustered index scan) is probably
most efficient.
If it is relatively small in comparison to the average row size, then
you could create a separate table with only a key column and the
varchar(2048) column, with a one-to-one relation to the original table.
HTH,
Gert-Jan
steve.c.thompson@gmail.com wrote:
> Here's the situation: we have a character based field of up to 2048
> characters in length that we want to store in a single column in a
> sqlserver table. need to be able to search on this field and guarantee
> uniqueness. it would be nice if this field has an index with a unique
> constraint. the problem is that sqlserver can't index anything over
> 900 characters in length.
> any suggestions appreciated.
> we have a couple thoughts so far:
> 1 - compress this field in the data access layer before saving in sql -
> that should keep the length of the field under 900 chars and let us
> apply a unique index in sql. however, we would still have the
> uncompressed, unindexed version of the field in another column
> searching.
> 2 - split the field into multiple columns each under 900 characters.
> apply an index to each column. however, we'd lose the uniqueness
> constraint and this would complicate searches.
> 3 - combination of 1 and 2.
> by the way - we're using SQL Server 2005.
|||Thanks for the replies guys.
We were originally using a hash for the key as David suggested, but
thinking back to my days at university when we had to program hash
algorithms, they don't guarantee uniqueness. In that case the insert
would cause a SQL exception and we wouldn't be able to store certain
data strings with non-unique hashes. That would be bad.
In the end, I think we're just going to have to reduce the field to 900
characters as Gert-Jan has suggested. I don't think there's any other
way around it.
steve
|||Why didn't you use the text datatype? Any reason?
|||I need to be able to put a unique index on this field; text fields do
not allow that.

varchar(2048) column, with a one-to-one relation to the original table.
HTH,
Gert-Jan
steve.c.thompson@gmail.com wrote:
> Here's the situation: we have a character based field of up to 2048
> characters in length that we want to store in a single column in a
> sqlserver table. need to be able to search on this field and guarantee
> uniqueness. it would be nice if this field has an index with a unique
> constraint. the problem is that sqlserver can't index anything over
> 900 characters in length.
> any suggestions appreciated.
> we have a couple thoughts so far:
> 1 - compress this field in the data access layer before saving in sql -
> that should keep the length of the field under 900 chars and let us
> apply a unique index in sql. however, we would still have the
> uncompressed, unindexed version of the field in another column
> searching.
> 2 - split the field into multiple columns each under 900 characters.
> apply an index to each column. however, we'd lose the uniqueness
> constraint and this would complicate searches.
> 3 - combination of 1 and 2.
> by the way - we're using SQL Server 2005.|||Thanks for the replies guys.
We were originally using a hash for the key as David suggested, but
thinking back to my days at university when we had to program hash
algorithms, they don't guarantee uniqueness. In that case the insert
would cause a sql exception and we wouldn't be able to store certain
data strings with non-unique hashes. That would be bad.
In the end, I think we're just going to have to reduce the field to 900
characters as Gert-Jan has suggested. I don't think there's any other
way around it.
steve|||why didn't you use the text datatype? any reason|||i need to be able to put a unique index on this field, and text fields do
not allow that.
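The hash-column idea discussed in this thread can be sketched as follows. This is an illustrative schema (table and column names are made up, not the poster's actual table), and it uses HASHBYTES, which is available in SQL Server 2005, instead of BINARY_CHECKSUM to make accidental collisions far less likely:

```sql
-- Sketch of the hash-column approach (names are illustrative).
-- A persisted computed column stores a 20-byte SHA-1 digest of the
-- wide field; 20 bytes fits easily under the 900-byte index key limit
-- and can carry a unique index.
CREATE TABLE dbo.WideText (
    ID       int IDENTITY(1,1) PRIMARY KEY,
    Body     varchar(2048) NOT NULL,
    BodyHash AS CAST(HASHBYTES('SHA1', Body) AS binary(20)) PERSISTED
);

CREATE UNIQUE INDEX UX_WideText_BodyHash ON dbo.WideText (BodyHash);
```

As the thread notes, a hash cannot strictly guarantee uniqueness: two different strings could in principle collide. With a 20-byte SHA-1 digest that is astronomically unlikely, but a truly strict guarantee would still require comparing the full values on conflict.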


Wednesday, March 21, 2012

How to include multiple membes in an expression?

Hello, I'm wondering how I can include multiple members from a dimension in a Calculated Measure expression?

Good (single member) ([ActionGroup].[ActionTag].&[147300], [Measures].[Unique Visitors])

#Value Error (multiple members)

([ActionGroup].[ActionTag].&[147300].&[107139], [Measures].[Unique Visitors])

({[ActionGroup].[ActionTag].&[147300], [ActionGroup].[ActionTag].&[107139]}, [Measures].[Unique Visitors])

Any pointer is appreciated

-Lawrence

Is this what you are looking for?

Aggregate({[ActionGroup].[ActionTag].&[147300], [ActionGroup].[ActionTag].&[107139]} , [Measures].[Unique Visitors])

How to include a single quote in a sql query

Hi

Declare @Customer varchar(255)
Set @Customer = Single quotes + customer name + single quotes

Select Customerid from Customer Where name = @Customer

I have the query written above, but I was not able to add single quotes in the SET statement. How do I go about it?

Early reply is much appreciated.

Thanks!

Can you not write:

set @Customer = 'customer name'

What is the problem with writing that? Do you get an error?

|||

Nope, here I am using a variable called "customer name" to which values will be passed dynamically,

so the query is like

set @Customer = single quotes + customer name (variable) + single quotes

Hope it is clear, or else if you need more information let me know.

Thanks!

|||

try this....

'''' + Customer Name + ''''

If you want to include a single quote in a string literal, you need to use two single quotes in a row.

-mani
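The rule above — double every single quote inside a string literal — can be demonstrated with a quick sketch (the value O'Brian is just an example):

```sql
-- Two single quotes inside a literal produce one literal quote.
-- '''' is a string containing a single ' character;
-- 'O''Brian' is the string O'Brian.
DECLARE @Customer varchar(255);
SET @Customer = '''' + 'O''Brian' + '''';  -- builds the text 'O'Brian'
PRINT @Customer;  -- prints: 'O'Brian'
```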

|||set @Customer = '''' + CustomerName + ''''|||

hey...

char(39) is the ASCII code for a single quote...

declare @sql varchar(110)

set @sql = 'select char(39)+name1+char(39) from test_name'

exec (@sql)

--assuming test_name has 2 records, mak and robin, the output is

'MAK'

'ROBIN'

|||

edukulla wrote:

Nope, here I am using a variable called "customer name" to which values will be passed dynamically,

so the query is like

set @Customer = single quotes + customer name (variable) + single quotes

Still not clear, a few more questions unless the other replies helped you.

What kind of variable is customer name?
How do you want to execute the SQL statements?
If you are doing this in a programming language, what programming language?

|||

Hello,

If your issue is that you are having difficulties finding a way to deal with a character string which may contain one or more single quotes, then the solution is NOT to surround the string with single quotes as a previous user suggested. This will only work if there is in fact only one single quote in your string, such as O'Brian. It will not work if there are multiple quotes, such as Here's O'Brian.

SET QUOTED_IDENTIFIER OFF

DECLARE @s VARCHAR(100)

SET @s = " Here's O'Brian and some quotes: ''''''''' "

PRINT @s

Cheers,

Rob
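Beyond the QUOTED_IDENTIFIER approach, two general-purpose techniques are worth sketching: doubling every embedded quote before splicing the value into dynamic SQL, and parameterizing with sp_executesql so no escaping is needed at all. The query below is modeled on the Customer query from the question; the variable names are illustrative:

```sql
-- Option 1: double every embedded single quote, then splice the value
-- into the dynamic statement. REPLACE turns each ' into ''.
DECLARE @name varchar(100), @sql nvarchar(300);
SET @name = 'Here''s O''Brian';  -- a value containing several quotes

SET @sql = N'SELECT Customerid FROM Customer WHERE name = '''
         + REPLACE(@name, '''', '''''') + N'''';
EXEC (@sql);

-- Option 2 (preferred): pass the value as a parameter instead.
EXEC sp_executesql
     N'SELECT Customerid FROM Customer WHERE name = @n',
     N'@n varchar(100)',
     @n = @name;
```

With sp_executesql the value never becomes part of the SQL text, so single quotes in the data can never break the statement; it also protects against SQL injection, which string splicing does not.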