Incorrect number of characters during move from Access table to SharePoint lists

ram_rtg

Registered User.
Local time
Tomorrow, 05:42
Joined
Jun 18, 2015
Messages
47
Hi everyone,

I want to convert the back end data tables of my split database (Access 2013) into SharePoint 2013 lists.

The first step was to make those tables local to the database and then build the database sub-site on SharePoint.

I then exported those tables to SharePoint using the Database Tools > SharePoint button and pointed the target list location to the newly created sub-site.

The process runs OK, but when I look at the tables in the database I see a local table named "Move to SharePoint Site Issues" that contains a single record with the following fields:
Issue: Some data will not be moved.
Reason: "SharePoint only supports 8192 characters in memo fields.
Object Type: tblMyData
Field Name: Notes

The Notes field is "Long Text" Data Type and when I exported the table to a Spreadsheet the largest has a character count of only 4000.

When I examined the data further, I noticed a few carriage returns where the user has hit the "Enter" key to separate their notes by date.

Does anyone know what might be causing this error?

Thanks in advance.
 

arnelgp

..forever waiting... waiting for jellybean!
Local time
Tomorrow, 03:12
Joined
May 7, 2009
Messages
19,169
You already have the answer.
 

ram_rtg

Registered User.
Local time
Tomorrow, 05:42
Joined
Jun 18, 2015
Messages
47
You already have the answer.

Do you mean that the "Enter" keys are the equivalent of 4192 characters?
There wouldn't be more than 20 of them in each memo field.
 

NauticalGent

Ignore List Poster Boy
Local time
Today, 15:12
Joined
Apr 27, 2015
Messages
6,286
I had similar issues with my move. Did you do a compact and repair before your migration?

Also, to identify which records were the culprits, I did a query on my memo fields. For example, in my query grid I used FieldLength: Len([myMemoField]). The results were character counts that I was able to sort from high to low. It turns out I had a few records with a count of over 10k.

Paring those records down is easy too, if you need to. Good luck, and let me know if you need to alter some of your records.
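If it helps, here is roughly the same idea in VBA so you can dump the offenders to the Immediate window. I'm assuming the table and field names from your first post (tblMyData / Notes) plus an ID column, so adjust to suit:

Public Sub ListOversizedMemos()

    Dim db As DAO.Database
    Dim rs As DAO.Recordset
    Dim strSQL As String

    ' Rows whose memo field exceeds the SharePoint limit, longest first.
    ' tblMyData, Notes and ID are assumed names - substitute your own.
    strSQL = "SELECT [ID], Len([Notes]) AS FieldLength " & _
             "FROM tblMyData " & _
             "WHERE Len([Notes]) > 8192 " & _
             "ORDER BY Len([Notes]) DESC;"

    Set db = CurrentDb
    Set rs = db.OpenRecordset(strSQL)

    Do Until rs.EOF
        Debug.Print rs![ID], rs!FieldLength    ' review in the Immediate window
        rs.MoveNext
    Loop

    rs.Close
    Set rs = Nothing
    Set db = Nothing

End Sub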
 

ram_rtg

Registered User.
Local time
Tomorrow, 05:42
Joined
Jun 18, 2015
Messages
47
I had similar issues with my move. Did you do a compact and repair before your migration?

Also, to identify which records were the culprits, I did a query on my memo fields. For example, in my query grid I used FieldLength: Len([myMemoField]). The results were character counts that I was able to sort from high to low. It turns out I had a few records with a count of over 10k.

Paring those records down is easy too, if you need to. Good luck, and let me know if you need to alter some of your records.

Thanks for your help. I have just tried a compact and repair, but the error still persists. I'm thinking that perhaps I need to work on the data tables while no users are accessing the database, just in case that is affecting something.

To identify which records had more than 8192 characters in the memo field, I exported the table to Excel, did a LEN(cell#) on each one, and then sorted by greatest value, in a similar way to how you used a query.

I'll try the query method just in case that works better/differently.

[EDIT]Hey NauticalGent, I tried using the LEN function inside a query and that returned 11 records with values over 8192 characters, so I guess there is a difference between how Access and Excel count characters. Thank you very much for your help with this.
Now all I have to do is figure out how to get the database to display the carriage returns in the text box on the form. They seem to have been dropped, and sentences are no longer separated by a carriage return or even a space.
[/EDIT]
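One thing worth checking on the missing line breaks (just a guess on my part): an Access text box only renders CR+LF as a new line, so if the migrated text still contains bare line feeds the breaks will look as though they have vanished even though the characters are there. A quick sketch of the check and the Replace() fix I have in mind; if the breaks were stripped out completely, of course, there is nothing to put back:

Public Sub InspectAndFixLineBreaks()

    Dim strNotes As String

    ' Pull one of the affected Notes values; tblMyData and the ID value
    ' are just placeholders from my earlier post.
    strNotes = Nz(DLookup("[Notes]", "tblMyData", "[ID]=1"), "")

    ' See which break characters are actually present.
    Debug.Print "CR+LF pairs: "; Len(strNotes) - Len(Replace(strNotes, vbCrLf, "x"))
    Debug.Print "Total LFs:   "; Len(strNotes) - Len(Replace(strNotes, vbLf, ""))

    ' If only bare LFs survive, restoring the CR makes the breaks visible
    ' again in Access text boxes.
    Debug.Print Replace(strNotes, vbLf, vbCrLf)

End Sub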
 
Last edited:

NauticalGent

Ignore List Poster Boy
Local time
Today, 15:12
Joined
Apr 27, 2015
Messages
6,286
Now all I have to do is figure out how to get the database to display the carriage returns in the text box on the form. They seem to have been dropped, and sentences are no longer separated by a carriage return or even a space.

If they were there before, SP must have removed them during the migration process. I seem to remember something similar; I decided I would just fix them as I encountered them and live with it.

One thing you should be aware of when you "make that move" is that SP makes the default length of each text field 255. You can go in and edit them so that your lists are not so large. This will increase performance significantly.

Here is a BLOG I stumbled across a couple of months back that has helped me a bit:

https://accessexperts.com/blog/2011/06/15/sharepoint-and-access-how-do-they-fit-together/

Great info that I just could not find anywhere else. The author co-wrote a book which I have ordered but not received yet, hopefully soon.

Anyway, best of luck, and if you happen to pick up a few tips and tricks, please share!
 

ram_rtg

Registered User.
Local time
Tomorrow, 05:42
Joined
Jun 18, 2015
Messages
47
Anyway, best of luck, and if you happen to pick up a few tips and tricks, please share!

Hey thanks for all your help,

I have moved the data to SharePoint, and after many hours of tinkering and playing around I think I may have stumbled on a solution regarding the character-count restriction. As part of the migration process I ran the query that identified all the records that exceeded 8192 characters, and then, rather than copying the data to a spreadsheet or table, I exported that query to a SharePoint list.
Voila! That worked like a charm, and it got me thinking that perhaps I could have used a query to export the entire original table to an SP list.

Now I'm having trouble maintaining up-to-date data, as the lists are not refreshing frequently enough to allow the allocation of unique customer IDs. I'm using RefreshLink in the code below:

Public Function NewCustID() As Long

    On Error GoTo NextID_Err

    Dim lngNextID As Long

    ' Refresh the link so the latest list data is available locally.
    CurrentDb.TableDefs("tblPracticeEmployeeData").RefreshLink

    ' Find the highest Employee ID in the tblPracticeEmployeeData table and add 1.
    lngNextID = DMax("[lngID]", "tblPracticeEmployeeData") + 1

    ' Assign the function the value of the next ID.
    NewCustID = lngNextID

' Exit the function after successful incrementing or after the error message.
Exit_NewCustID:
    Exit Function

' If an error occurred, display a message, then go to the Exit statement.
NextID_Err:
    MsgBox "Error " & Err.Number & ": " & Err.Description
    Resume Exit_NewCustID

End Function

That works for a single user, but when multiple users create new records simultaneously only one is successful; the others get kicked out with the message "This item could not be inserted or updated because duplicate values were found for one or more fields in the list." Do you have any suggestions to overcome this issue?
 

NauticalGent

Ignore List Poster Boy
Local time
Today, 15:12
Joined
Apr 27, 2015
Messages
6,286
Just to be clear, once you exported the table to SP with a query, you linked the list, correct?

If I have that right, then you should be using the ID field that SP uses (equivalent to Access’s PK/Autonumber) as your Customer ID. This will alleviate the need to do the Dmax+1 bit.
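If it helps, here is a minimal sketch of what I mean, assuming a form bound to the linked list; the button name (cmdShowNewID) is only a placeholder:

Private Sub cmdShowNewID_Click()

    ' Commit the new record first so the list assigns the ID value.
    If Me.Dirty Then Me.Dirty = False

    ' ID is the autonumber column of the linked SharePoint list - use it
    ' as the customer number rather than calculating one with DMax()+1.
    MsgBox "Customer ID assigned by the list: " & Me!ID

End Sub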

Also, be sure to enable the Cache option that is addressed in the same article I provided. The performance boost is phenomenal; I have remote users in the Middle East and Spain who are able to work in real-time mode with virtually zero latency.
 

ram_rtg

Registered User.
Local time
Tomorrow, 05:42
Joined
Jun 18, 2015
Messages
47
Just to be clear, once you exported the table to SP with a query, you linked the list, correct?

If I have that right, then you should be using the ID field that SP uses (equivalent to Access’s PK/Autonumber) as your Customer ID. This will alleviate the need to do the Dmax+1 bit.

Also, be sure to enable the Cache option that is addressed in the same article I provided. The performance boost is phenomenal; I have remote users in the Middle East and Spain who are able to work in real-time mode with virtually zero latency.


Yes that is correct. I had to link the list afterwards.

I was using the autonumber PK ID field; however, I had to remove that, as I was advised it was not good practice to use an autonumber as a meaningful field. So I had to programmatically create autonumbering using DMax()+1.

Also, if I have a PK Autonumber field with the name "ID", which is reserved for SP, what implications would that have on the data once "OLD_ID" is created during migration?
 
Last edited:

NauticalGent

Ignore List Poster Boy
Local time
Today, 15:12
Joined
Apr 27, 2015
Messages
6,286
I was using the autonumber PK ID field; however, I had to remove that, as I was advised it was not good practice to use an autonumber as a meaningful field. So I had to programmatically create autonumbering using DMax()+1.

It appears you have experienced the exception to that "advice" first-hand!
 

NauticalGent

Ignore List Poster Boy
Local time
Today, 15:12
Joined
Apr 27, 2015
Messages
6,286
Also, if I have a PK Autonumber field with the name "ID", which is reserved for SP, what implications would that have on the data once "OLD_ID" is created during migration?

Sorry, I missed this bit completely. SP will keep your PK field as long as it is an Autonumber; however, it will re-index it on migration and start over from 1. If you don't have any gaps (deleted records) then it should sync. But even so, SP will create a new field named _oldID and store the prior values there.

If you have your relationships set up correctly, and by that I mean one-to-many with enforced referential integrity, then it will move all those tables as well and ensure your FKs on the related tables match up with the master table.
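If you want to reassure yourself after the move, a rough check along these lines lists any child rows that no longer point at an existing parent; tblCustomers, tblOrders and CustomerID are made-up names, so substitute your own linked lists:

Public Sub CheckOrphanedChildRows()

    Dim rs As DAO.Recordset
    Dim strSQL As String

    ' Child rows whose FK has no matching ID in the parent list.
    strSQL = "SELECT o.ID, o.CustomerID " & _
             "FROM tblOrders AS o LEFT JOIN tblCustomers AS c " & _
             "ON o.CustomerID = c.ID " & _
             "WHERE c.ID IS NULL;"

    Set rs = CurrentDb.OpenRecordset(strSQL)

    Do Until rs.EOF
        Debug.Print "Orphaned child row "; rs!ID; " still points at "; rs!CustomerID
        rs.MoveNext
    Loop

    rs.Close
    Set rs = Nothing

End Sub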

Just be sure to make a backup of everything before you migrate, just in case!
 

ram_rtg

Registered User.
Local time
Tomorrow, 05:42
Joined
Jun 18, 2015
Messages
47
Just to be clear, once you exported the table to SP with a query, you linked the list, correct?

If I have that right, then you should be using the ID field that SP uses (equivalent to Access’s PK/Autonumber) as your Customer ID. This will alleviate the need to do the Dmax+1 bit.

Also, be sure to enable the Cache option that is addressed in the same article I provided. The performance boost is phenomenal; I have remote users in the Middle East and Spain who are able to work in real-time mode with virtually zero latency.

Hey NauticalGent, thanks for your help so far. I have managed to resolve the DMax issue by incrementing the value of the customer ID by one instead of running DMax()+1 again.

The reason it was failing (I think) was that there was no real-time or persistent connection to SP. Even RefreshLink on the linked tables didn't help.
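For anyone else hitting the same thing, this is roughly the idea, as a sketch rather than my exact code; it assumes the duplicate rejection comes back as a trappable run-time error when the record is saved, and cmdSaveCustomer / lngID are just the names from my test form:

Private Sub cmdSaveCustomer_Click()

    Dim intAttempt As Integer

    On Error GoTo RetryID

    Me!lngID = NewCustID()     ' the DMax()+1 routine posted earlier
    Me.Dirty = False           ' commit the record to the linked list

    Exit Sub

RetryID:
    ' If the list rejects the row as a duplicate, bump the ID and retry a
    ' few times instead of calling DMax() again on stale data.
    intAttempt = intAttempt + 1
    If intAttempt <= 5 Then
        Me!lngID = Me!lngID + 1
        Resume                 ' retries the line that raised the error
    End If

    MsgBox "Could not assign a unique Customer ID: " & Err.Description

End Sub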

This is still an issue, I think, because record locking isn't being enforced: even though I have form- and query-level "Edited Record" locking, the behavior I am seeing is similar to that of No Locks.

For example:
User "A" goes into a record and starts typing. Then User "B" goes into the same record and starts typing and saves the record without any prompts. Then when user "A" tries to save the record they are presented with the option to drop changes, copy their changes to clipboard or save changes overwriting user "B"s changes. This is the behavior of "No Locks" option.

There seems to be no way to overcome it. The only option is to track changes by enabling version tracking in the SP list, but this will still result in data loss because the database only shows the current version.
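The closest thing to a safeguard I can come up with is a check of my own before saving rather than a real lock. A sketch of what I have in mind, assuming the list's Modified column is exposed in the linked table and that DLookup() goes back to the list rather than a local cache (the table and field names are from my earlier posts):

Private mdtLoadedStamp As Date

Private Sub Form_Current()
    ' Remember the Modified stamp the record had when it was opened.
    If Not Me.NewRecord Then mdtLoadedStamp = Nz(Me!Modified, 0)
End Sub

Private Sub Form_BeforeUpdate(Cancel As Integer)

    Dim varServerStamp As Variant

    If Me.NewRecord Then Exit Sub

    ' Re-read the row from the linked list just before saving.
    varServerStamp = DLookup("[Modified]", "tblPracticeEmployeeData", "[ID]=" & Me!ID)

    If Not IsNull(varServerStamp) Then
        If varServerStamp > mdtLoadedStamp Then
            MsgBox "This record was changed by another user after you opened it. " & _
                   "Your edits have not been saved; please reload the record and try again."
            Cancel = True
            Me.Undo
        End If
    End If

End Sub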
 
