"when I try to change a column in the filter buffer"
Are you changing data in the primary buffer, then filtering it out, and then doing the update?
Replace the cursor with a DataWindow/DataStore.
Retrieve the data into the DataWindow/DataStore; then either use another DataWindow/DataStore to add the rows, or you can still call the insert directly.
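A rough sketch of that approach (the DataObject names d_source and d_target and the column name are purely illustrative):

```powerscript
// Sketch: replace a cursor loop with two DataStores
datastore lds_source, lds_target
long ll_row, ll_new

lds_source = CREATE datastore
lds_source.DataObject = "d_source"      // illustrative name
lds_source.SetTransObject(SQLCA)
lds_source.Retrieve()

lds_target = CREATE datastore
lds_target.DataObject = "d_target"      // illustrative name
lds_target.SetTransObject(SQLCA)

// Walk the retrieved rows and add them to the target
FOR ll_row = 1 TO lds_source.RowCount()
    ll_new = lds_target.InsertRow(0)
    lds_target.SetItem(ll_new, "some_column", &
        lds_source.GetItemString(ll_row, "some_column"))
NEXT

IF lds_target.Update() = 1 THEN
    COMMIT;
ELSE
    ROLLBACK;
END IF

DESTROY lds_source
DESTROY lds_target
```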
I just did a quick test in the Database Painter. When you view a Table's data contents the Results data area is really using a DW Object.
The steps I did ...
1) Retrieve data
2) Modify a column in a row & tab out
3) Set a filter for the row that was modified (it now disappears - as expected, as it's in the Filter buffer).
4) Press SAVE on the DB Painter's toolbar (uses the Update method internally).
5) Retrieve data
6) Remove the Filter
7) The modified column's new value is present => therefore the Update must have worked on the Filter buffer!
Validated:
PB 12.6: OK
PB 12.5.1: OK
PB 12.1: OK
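In straight DataWindow code, that test sequence would look roughly like this (the column name and filter expression are illustrative):

```powerscript
// 1-2) Retrieve, then modify a column in a row
dw_1.SetTransObject(SQLCA)
dw_1.Retrieve()
dw_1.SetItem(1, "status", "CLOSED")
dw_1.AcceptText()

// 3) Filter the modified row out - it moves to the Filter buffer
dw_1.SetFilter("status <> 'CLOSED'")
dw_1.Filter()

// 4) Update - per the test above, the Filter-buffer change is still saved
IF dw_1.Update() = 1 THEN
    COMMIT;
ELSE
    ROLLBACK;
END IF
```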
HTH
Thanks Chris...
I need to be more specific: change the data in the Filter buffer itself, not in the Primary buffer before the filter is applied.
i.e., dw.object.columnname.filter[row#] = value
I was trying to make changes so that I did not have to remove the filter from the dw before updating.
Hope this helps...
Is this the best and quickest way to accomplish this using PB? Or would using a database driven approach be better?
Hi Jeff:
ASE driver updates over SYC (AFAIK):
1) Yes, longer entity names - now up to 256 characters.
2) BIGTIME: Includes the hour, minute, second, and fraction of a second. The fraction is stored to six decimal places.
3) BIGDATETIME: Includes the year, month, day, hour, minute, second, and fraction of a second. The fraction is stored to six decimal places.
HTH
Regards ... Chris
Why are you bringing all that data back to the client in the first place?
I would write that as a stored procedure and do it ALL on the server... The only thing you need to return from that is maybe an error code and diagnostic text if something goes wrong.
Paul's tried and true rule of thumb: If the user doesn't need to see and interact with the data, don't waste their time retrieving it.
-Paul-
Hi David;
That could be the variation ... updating the row in the Filter buffer - which I never do.
My suggestion would be to retrieve the result set into a DataStore and then ShareData its primary buffer with the DW Control. A Filter() on the DC will not affect the DS's primary buffer. Then change the updating mechanism to use only the DS. That way the DS's row data should always update the DB properly, while the user still sees the DC's primary buffer (which, under ShareData, can differ from the DS's) with a DC.Filter() in effect.
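A rough sketch of that arrangement (the DataObject name and filter expression are illustrative):

```powerscript
// Sketch: keep updates on the DataStore; the DW control only displays
datastore lds_data
lds_data = CREATE datastore
lds_data.DataObject = "d_mydata"    // illustrative name
lds_data.SetTransObject(SQLCA)
lds_data.Retrieve()

// Share the DataStore's primary buffer with the visible DW control
lds_data.ShareData(dw_1)

// Filter only the control; the DataStore's buffer is unaffected
dw_1.SetFilter("status = 'open'")   // illustrative filter
dw_1.Filter()

// ... user edits made via dw_1 flow into the shared data ...

// Update through the DataStore, not the control
IF lds_data.Update() = 1 THEN
    COMMIT;
ELSE
    ROLLBACK;
END IF
```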
Food for thought.
Regards ... Chris
Is the column selected as "Updateable" in Rows > Update Properties? PB doesn't bother with update flags unless the column is listed as updateable.
-Paul-
Sure, but you save all the work of marshalling the result set, putting it on the wire, transmitting it to the client, having the client reserve memory space to store it, and so on...
And then vice-versa - when transactions are sent back from the client to the server. None of that happens either.
With stored procedures, all of that happens within the server.
I would code the stored procedure to select the data into a temp table. Then cursor through the temp table and do the insert/updates within that loop.
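Something along these lines (ASE T-SQL; all table and column names are made up for illustration):

```sql
CREATE PROCEDURE sp_process_new_orders AS
BEGIN
    -- Select the working set into a temp table
    SELECT order_id, amount
      INTO #work
      FROM orders
     WHERE status = 'NEW'

    DECLARE @id int, @amt money

    DECLARE c CURSOR FOR
        SELECT order_id, amount FROM #work

    OPEN c
    FETCH c INTO @id, @amt
    WHILE @@sqlstatus = 0
    BEGIN
        -- do the per-row insert/update work here
        UPDATE order_totals
           SET total = total + @amt
         WHERE order_id = @id
        FETCH c INTO @id, @amt
    END
    CLOSE c
    DEALLOCATE CURSOR c
END
```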
A cursor is essentially a temp table. Why pull all that data into a temp table, only to open a cursor (another temp table) right on top of it?
That approach would double the amount of memory that the server has to reserve for the result set.
Hi.
Try this change in the "menu items" section:
lb = luo_app.menuitemexecute('Edit')
lb = luo_app.menuitemexecute('SelectAll')
lb = luo_app.menuitemexecute('Edit')
lb = luo_app.menuitemexecute('Copy')
HTH,
Luiz
In Oracle I would use a BULK COLLECT to get the data into an array. No need for slow cursor processing.
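For example, something like this (table and column names are made up; BULK COLLECT fills the collection in one fetch, and FORALL applies the DML in one round trip):

```sql
DECLARE
    TYPE t_ids IS TABLE OF orders.order_id%TYPE;
    l_ids t_ids;
BEGIN
    -- Pull all matching keys into the array in a single fetch
    SELECT order_id
      BULK COLLECT INTO l_ids
      FROM orders
     WHERE status = 'NEW';

    -- Apply the updates as one bulk DML statement
    FORALL i IN 1 .. l_ids.COUNT
        UPDATE order_totals
           SET processed = 'Y'
         WHERE order_id = l_ids(i);

    COMMIT;
END;
```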
Ugh. Oracle. I feel a "bulk collect" about to release itself right now...
It isn't really clear what you want. Do you want to grab the text from a PDF and store the text in the database?
Interesting issue, David, thanks for responding.
According to the help file (search "disablebind"):
DisableBind=1 for ADO.NET, ASE, SYC, SNC, and OLE DB, DisableBind=0 for other interfaces
So, as I'm reading it, binding is turned OFF for any Sybase connection by default.
(As an aside, that begs the question: Why did they do it that way? Is there something about those connections that is better without binding?)
Since I've never manipulated disablebind, I'm thinking that I'm using your fix by default. But Chris' comment sounds like he prefers to use binding.
Finally, it looks like there is a functionality difference. (The help file goes into quite a bit of detail about how the setting interacts with database default values.) For that reason I'm thinking I don't want to fool with that switch without a real good reason, as doing so would require a lot of code review and testing.
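For reference, the switch is just a DBParm set before connecting; a minimal sketch (server/login parameters omitted):

```powerscript
// Force bind variables ON for an ASE/SYC connection
SQLCA.DBMS = "SYC Adaptive Server Enterprise"
SQLCA.DBParm = "DisableBind=0"   // 0 = use binding; 1 = disable (the Sybase default)
CONNECT USING SQLCA;
```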
Any additional insight that might soothe my inner nerd would be appreciated.
Jeff
You should use pipelined table functions; they help relieve that bulk collection feeling.
I would recommend a private snow cone instance to address any array irritation and discomfort once the bulk collection relief is complete.
Hello Sandra,
This is a known problem with a specific version of PB 12.6:
Read this thread for more details: http://scn.sap.com/thread/3622189
HTH
Jacob