SLOSifl
Aug 10, 2002


Are you looking for data that occurred before TODAY or NOW?

edit: Oh, you had < when you needed >? Sounds like your boss could work as a bug tester at my office. "There's a bug :colbert:"
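
To illustrate the difference (Events and EventDate are made-up names):
code:
-- "Before NOW" includes rows from earlier today; "before TODAY" stops at midnight this morning
SELECT * FROM Events WHERE EventDate < GETDATE()                                     -- before NOW
SELECT * FROM Events WHERE EventDate < DATEADD(day, DATEDIFF(day, 0, GETDATE()), 0)  -- before TODAY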


SLOSifl
Aug 10, 2002


Victor posted:

Last time I tried SQL Prompt, I found that it requires you to type ctrl-space or something to activate the dropdown (or the dropdown would appear way too often), wouldn't autocomplete after pressing space, and other such annoyances. Maybe I've just acclimated to VS2005's intellisense, but I really like it, more than any other intellisense I've used. Oh, I don't think SQL Prompt does autocompletion for INFORMATION_SCHEMA, which I use a lot.
I had a problem with SQL Prompt because it would just sort of disappear every once in a while. It would work fine for a while, and then it wouldn't show up anymore.

Regarding nulls, as mentioned by Victor, they always evaluate to unknown. "null is null" is true, of course, but "null = null" is unknown. Anything plus null is null as well. This can be confusing if you don't know the behavior. The main problem is expecting "null = null" to work and using it in a JOIN or WHERE clause.

code:
SELECT FirstName + ' ' + LastName as [Name] FROM Customers
If either of those columns is NULL, the whole result is NULL. This is why you have to use COALESCE or some flow control to handle potentially NULL columns.
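
For example, something like this keeps the name intact when one of the columns is NULL:
code:
-- Treat NULL name parts as empty strings so the whole expression doesn't go NULL
SELECT COALESCE(FirstName, '') + ' ' + COALESCE(LastName, '') AS [Name] FROM Customers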

SLOSifl
Aug 10, 2002


I just got a pretty hosed up email from work. We're making some pretty big cutbacks (a lot of people lost their job, and it's a small company), so I'm looking for ideas to help out.

We have a lot of in-house toolkits that my coworker and I have developed to make our lives easier. One of them is a pretty powerful database abstraction layer for MSSQL 2005, 2008 and 201x.

It's a versioned database abstraction layer that emits and consumes only XML, handles all table relationships automatically (including bridge objects, e.g. Object A can be associated with Object B through Object C), and comes with a Web Service and .NET API. It also provides events (web service calls, email, etc.), has tons of filtering options, and can attach to any number of databases.

We use it to lessen the annoyance of an industry standard database, and it's actually faster than even well optimized queries most of the time.

The feature list is very long, and the amount of things it does for us is crazy.

I'm considering talking to the CEO about packaging it as a distinct product. Of all you SQL people, does anyone think this is a sellable product for use as middleware? If you could work with an annoying database through well defined XML based objects, would that help you?

I'll be happy to field any questions here if appropriate, via PM or AIM, or by email. I'm not looking for actual buyers, I'm just curious if such a tool could be marketed in the near future. I feel bad even posting this, but since I have to take a 10% pay cut just to keep my job, I think it's worth probing for interest.

SLOSifl
Aug 10, 2002


Yeah, I'm looking around at other products right now. This sucks, I don't know what to do if the company I've spent 10 years at goes under. :(

SLOSifl
Aug 10, 2002


Touché.

I haven't found any competing products that do anything close to what ours does yet. I'm not sure if that's good or bad, since it makes it hard to determine whether there's a market or not.

SLOSifl
Aug 10, 2002


You can map them through a third table if you can't update table B.
code:
declare @id_b int

-- insert into B
select @id_b = @@identity  -- capture the identity of the row just inserted into B

insert into MapAtoB ( ID_A_a, ID_A_b, ID_A_c, ID_B ) values ( @id_a_a, @id_a_b, @id_a_c, @id_b )
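
Side note: if table B has triggers that themselves insert into tables with identity columns, @@identity can pick up the wrong value. SCOPE_IDENTITY() avoids that:
code:
select @id_b = SCOPE_IDENTITY()  -- only sees identity values generated in the current scope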

SLOSifl
Aug 10, 2002


It looks like the problem is SQL Server can't open the backup file. Set the permissions for YourComputerName\NetworkService or Everyone to allow read or full access for that .bak file.

SLOSifl
Aug 10, 2002


Apparently it doesn't.

Give SYSTEM or Network Service full permissions to the file, depending on which account the SQL Server service runs as. Neither is covered by "Users".

SLOSifl
Aug 10, 2002


Ah, that makes sense since you don't appear to be using the default SQL data folder (which is fine). I just assumed you were restoring to a folder SQL had access to.

SLOSifl
Aug 10, 2002


Victor posted:

Try/Catch helps some, but I honestly do not know if the transaction will be rolled back if your CATCH block is hit. I advise you to test this stuff. That's how I figure out a lot of things...
It won't be rolled back.

edit: Unless you roll back the tran in there. Here's a safer pattern:
code:
BEGIN TRAN
BEGIN TRY
    -- Query code
    COMMIT TRAN
END TRY
BEGIN CATCH
    ROLLBACK TRAN
    PRINT ERROR_MESSAGE()
END CATCH
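
For what it's worth, a slightly more defensive version checks that there's still an open transaction before rolling back, in case the query code committed or rolled back on its own:
code:
BEGIN TRAN
BEGIN TRY
    -- Query code
    COMMIT TRAN
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRAN
    PRINT ERROR_MESSAGE()
END CATCH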

SLOSifl fucked around with this message at 16:45 on Mar 12, 2009

SLOSifl
Aug 10, 2002


Are you trying to update existing rows or is allPosts empty?

If you're just dumping a single column from a table into allPosts (which is empty), then you need DISTINCT after the SELECT.
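
Something like this, with made-up names for the source table and column:
code:
-- Copy each distinct value into allPosts exactly once
INSERT INTO allPosts ( PostText )
SELECT DISTINCT PostText FROM SourcePosts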

SLOSifl
Aug 10, 2002


code:
INSERT INTO WhateverTable ( Column1, Column2, Column3 )
SELECT [Column1], [Column2], [Column3] FROM INSERTED

SLOSifl
Aug 10, 2002


No, there's no UPDATED pseudo-table. It's always called INSERTED.

Yes, it'll work on a batch operation.

SLOSifl
Aug 10, 2002


I'm a bit of a SQL-CLR whore, so I'm sure there's a pure T-SQL method I'm ignoring. If you want to stick to T-SQL, you can write an SP that does basically the same thing (there's a rough sketch at the bottom of this post), or Google for T-SQL string split and see where that gets you.

You could make a CLR Stored Procedure that runs your query, parses the [names] column and adds the values to a List<string>, returning the name using SqlContext.Pipe.Send (SqlDataRecord) if the list didn't already contain that name.
code:
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using Microsoft.SqlServer.Server;

public partial class StoredProcedures {
   [Microsoft.SqlServer.Server.SqlProcedure]
   public static int GetNames() {
      // Create a list to keep track of names you've encountered
      var nameList = new List<string>();

      // Create a SqlDataRecord with a single NVARCHAR(40) column
      var metadata = new SqlMetaData ( "Name", SqlDbType.NVarChar, 40 );
      var outputRecord = new SqlDataRecord ( new SqlMetaData[] { metadata } );

      using ( var conn = new SqlConnection ( "context connection=true" ) ) {
         conn.Open ();

         using ( var cmd = conn.CreateCommand() ) {
            cmd.CommandText = "SELECT [names] from [Table]";

            using ( var reader = cmd.ExecuteReader () ) {
               while ( reader.Read () ) {
                  // Skip the row if it's null
                  if ( reader.IsDBNull ( 0 ) ) continue;

                  var nameColumn = reader.GetString ( 0 );

                  // Split the values into an array
                  foreach ( var name in nameColumn.Split ( new[] { ',' },
                              StringSplitOptions.RemoveEmptyEntries ) ) {
                     // Ignore names you've seen already (case-insensitive)
                     if ( nameList.Contains ( name.ToLower() ) ) continue;

                     // Add the name to the list
                     nameList.Add ( name.ToLower() );

                     // Set the record's value and spit it out as a one-row result set
                     outputRecord.SetString ( 0, name );
                     SqlContext.Pipe.Send ( outputRecord );
                  }
               }
            }
         }
      }

      return nameList.Count;
   }
}
Something like that. I don't expect that code to work since I wrote it off the top of my head.
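
And here's the pure T-SQL route I mentioned, as an equally untested sketch. It peels the comma-separated values apart with a recursive CTE, using the same hypothetical [Table] and [names]:
code:
;WITH Split AS (
    -- Anchor: peel the first comma-separated value off each row
    SELECT
        CAST(LEFT([names], CHARINDEX(',', [names] + ',') - 1) AS nvarchar(4000)) AS [Name],
        CAST(STUFF([names], 1, CHARINDEX(',', [names] + ','), '') AS nvarchar(4000)) AS Remainder
    FROM [Table]
    WHERE [names] IS NOT NULL

    UNION ALL

    -- Recursive part: keep peeling values off the remainder until it's empty
    SELECT
        CAST(LEFT(Remainder, CHARINDEX(',', Remainder + ',') - 1) AS nvarchar(4000)),
        CAST(STUFF(Remainder, 1, CHARINDEX(',', Remainder + ','), '') AS nvarchar(4000))
    FROM Split
    WHERE Remainder <> ''
)
SELECT DISTINCT LTRIM(RTRIM([Name])) AS [Name]
FROM Split
WHERE [Name] <> ''
OPTION (MAXRECURSION 0);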

SLOSifl fucked around with this message at 21:20 on Apr 30, 2009

SLOSifl
Aug 10, 2002


Triple Tech posted:

And yet neither of you mention how that type of data is set up poorly for relational databases?
Absolutely, but there are plenty of cases where changing the DB schema or the way some piece of poo poo program interacts with it is a non-starter. I just assumed he had to deal with it in some one-off type of export situation just to get it over with.

If you frequently need to pull unique names from that mess, then yes, you're better off rethinking it. One option would be to make a surrogate table and throw a trigger on your existing one. On insert, update or delete, update your normalized table. Remember that the pseudo-table in *both* insert and update triggers is called INSERTED and you'll be fine.
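
A rough sketch of that trigger, with made-up table and column names (it also assumes a table-valued split function along the lines of the CTE from my last post):
code:
CREATE TRIGGER trg_SourceTable_SyncNames
ON SourceTable
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- Updates and deletes: clear out the old normalized rows first.
    -- DELETED holds the pre-change rows for both.
    DELETE FROM NormalizedNames
    WHERE SourceID IN ( SELECT ID FROM DELETED );

    -- Inserts and updates: re-split the CSV column. The pseudo-table is
    -- called INSERTED for both.
    INSERT INTO NormalizedNames ( SourceID, [Name] )
    SELECT i.ID, s.[Name]
    FROM INSERTED i
    CROSS APPLY dbo.SplitNames ( i.[names] ) s  -- hypothetical split TVF
    WHERE i.[names] IS NOT NULL;
END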

My CLR solution posted on the previous page will get you your list and should be basically working. 40,000 rows isn't a lot. At least with a CLR option, you're processing the data before it ever leaves the server. Doing it client side is a perfectly workable option as well, as long as you don't have to do it that often.

SLOSifl
Aug 10, 2002


SQL Express is great. Other than lacking an integrated SQL Agent for job scheduling and having a 4GB database size limit, it's an extremely good free database with great tools.


SLOSifl
Aug 10, 2002


Anaxandrides posted:

I can say that this is 100% possible to write in MSSQL. It's possible to write many, many things that you should not write, though, and this is one of them. You're much better off having multiple UPDATE statements with differing WHERE clauses than having one massive UPDATE that selectively updates based on the value in the row -- isn't that what WHERE clauses are for?
Agreed. The complexity of the logic and the wide-net WHERE clause, if any, will be an indexing nightmare. You'll get better performance out of tight WHERE clauses and a few column updates.

What is the concern about updating columns that aren't changing? The only case where that could theoretically matter is if you update rows that don't need any updates at all, but even that can be offset by an efficient WHERE clause. Just update all the columns if any of them have changed...or do you have triggers on the table that care if certain columns were touched?

code:
UPDATE A SET Column1 = Val1, Column2 = Val2            --update everything
FROM TableA A WHERE Column1 <> Val1 OR Column2 <> Val2 --if anything changed
is probably the best option under most normal circumstances. I use the UPDATE X FROM Table X syntax under the assumption that you are joining on another table somewhere. There is no penalty to this pattern, and it is easier to extend later.
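
For example, extended with a join (TableB and its columns are made up):
code:
UPDATE A SET Column1 = B.Val1, Column2 = B.Val2
FROM TableA A
JOIN TableB B ON B.ID = A.ID
WHERE A.Column1 <> B.Val1 OR A.Column2 <> B.Val2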

SLOSifl fucked around with this message at 13:12 on Sep 9, 2014
