Deleting a Workspace Specification with a Large Database Footprint

Problem

When deleting a workspace specification, the Perforce Server carries out several individual tasks. For workspaces with a large number of associated database records - whether from a large view, many synced files, or many opened files - deletion can affect the performance of other operations. By identifying the individual pieces of this process and carrying them out manually, one at a time, the performance impact of deleting such workspaces can be substantially reduced.

Note: Deleting a user workspace from the Perforce server is covered in our online documentation, and the more specific case of reverting another user's files is detailed in our knowledge base article Reverting Another User's Files.

Solution

When deleting a workspace from Perforce, related metadata about the state of the workspace must be removed as well. The Perforce Server checks the database to identify the files that are shelved in the workspace, those that are opened, and those that are synced. These records must all be removed before the workspace specification itself can be deleted.

Shelved files

It is not possible to delete a workspace if it contains any shelved files. As explained in Reverting Another User's Files, you must first delete these files using the p4 shelve -d command.
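For example, a super user can list the shelved changelists associated with the workspace, then delete the shelved files from each one. The changelist number 1234 below is hypothetical; the "-f" flag allows the super user to delete another user's shelved files:

p4 changes -s shelved -c workspace_name

p4 -u admin shelve -d -c 1234 -f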

Opened files

As an administrator, running the p4 opened command identifies all files opened on a specific workspace:

p4 opened -a -C workspace_name

 

Running p4 revert -k allows you to revert the files opened in this workspace. For example, if the p4 opened output indicates that the files are opened by the user "bruno", and assuming a super user account "admin", run the commands:

p4 -u admin login bruno

p4 -u bruno -c workspace_name -H host_name revert -k //workspace_name/...

Note: The "-k" flag allows the super user to revert the workspace files without requiring a local sync of those files. The "-H" flag uses the same host name noted in the workspace specification.

Synced files

You can remove files from the "have" list for this workspace using a command similar to:

p4 -u admin -c workspace_name -H host_name sync -k //workspace_name/...#none

Note: As with the "revert" command, the "-k" flag ensures that only the server metadata is updated without requiring a local file update.

To get a list of all files synced to the workspace, you can run p4 have for this workspace:

p4 -u admin -c workspace_name -H host_name have
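If you only need a count of how many files are synced, rather than the full listing, you can pipe the output through a line counter (a sketch assuming a Unix-like shell):

p4 -u admin -c workspace_name -H host_name have | wc -l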

If a very large number of files are synced, it can help to sync to #none in smaller sections. For example, if many files are synced from depot "foo" and depot "bar", the following commands break the work into smaller tasks:

 

p4 -u admin -c workspace_name -H host_name sync -k //foo/...#none
p4 -u admin -c workspace_name -H host_name sync -k //bar/...#none
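If files are synced from many depots, the same pattern can be scripted rather than typed out per depot. A minimal sketch, assuming a Unix-like shell and the hypothetical workspace and host names used above (the "-F" output-formatting flag requires a reasonably recent command-line client):

for depot in $(p4 -u admin -ztag -F %name% depots); do
    p4 -u admin -c workspace_name -H host_name sync -k "//$depot/...#none"
done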

 

Removing the Workspace Specification

After identifying and removing the shelved, opened, and synced files from the workspace, the only remaining metadata relating to the specification is that of the workspace itself - the owner, root, view, and so forth. With the file-related metadata already removed, running the command "p4 client -d workspace_name" to remove the workspace completes much faster.
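If the workspace is owned by another user, a super user can add the "-f" flag to force the deletion:

p4 -u admin client -d -f workspace_name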

If the workspace view is large or complex, the deletion may still take some time to complete. This is because the Perforce Server still runs the same process of searching for records related to the workspace - through revisions, protections, and so on. That said, the time needed to find and remove the related records from the assorted database tables has been eliminated.

Note: If you are still experiencing problems when trying to remove your workspace specification, please contact Perforce Support.
